Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “AI:” “Wait you want to know what the last day on human earth looks like when he… (ytc_UgytHceN1…)
- We have a model for creating intelligence that supersedes our own: children! A… (ytc_Ugxm9d2kE…)
- This will happen slowly. It’s been creeping up on society for years. Which is b… (ytc_Ugz3tcBdS…)
- Let me know when Leonardo Dicaprio is smuggling weapons to Earth First to sabota… (rdc_esp1ja8)
- They do not want to regulate it, they seek to train it, and have already begun d… (ytc_UgykYaq9H…)
- 3 year old me's drawing of my brother will always be better than the most breath… (ytc_UgypwYhZ4…)
- its really telling that as i watched this video as backround noise while sewing … (ytc_UgzIg-s70…)
- Great presentation. AI is our partner in everything including teaching security… (ytc_UgxGj2mhP…)
Comment
I think it will do what it's taught is best. If we're not teaching it value for the human life, then how will it learn? It needs to be brainwashed to believe that humans are detrimental to even their own existence, that we are the utmost important component of life, that every single human life needs to be protected. It needs something similar to the Ten Commandments mandatorily programmed in every model. A Code of Ethics. If we're going to create something and equip it with tools that could be used against us, we should at least create it to not want to.
There are some humans that would much rather remove those ants, and that's because of compassion. Unfortunately with us, some have it some don't. But if we had the ability to "program" people, everyone would. We have the ability to program AI.
youtube · AI Governance · 2025-10-04T01:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
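The four coding dimensions in the table above can be modeled as a simple record per comment. A minimal sketch in Python; the class and field names are assumptions for illustration, not the tool's actual schema, and the example label sets in the comments are only those visible in this page's data.

```python
from dataclasses import dataclass

# Hypothetical record mirroring the "Coding Result" table above.
# Field names are assumptions; label sets are inferred from this sample.
@dataclass
class CodingResult:
    responsibility: str  # e.g. "developer", "ai_itself", "distributed", "none"
    reasoning: str       # e.g. "deontological", "consequentialist", "virtue", "mixed"
    policy: str          # e.g. "regulate", "none"
    emotion: str         # e.g. "fear", "outrage", "approval", "indifference", "resignation"
    coded_at: str        # ISO 8601 timestamp of when the coding was produced

# The coding shown in the table above, as a record:
result = CodingResult(
    responsibility="developer",
    reasoning="deontological",
    policy="regulate",
    emotion="fear",
    coded_at="2026-04-27T06:26:44.938723",
)
```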
Raw LLM Response
```json
[
  {"id":"ytr_Ugxd6tb0mcBPAFB6q_p4AaABAg.ANs-vhVBzp3ANwSq_Y113w","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgzTtu-5wdkpZHWiaap4AaABAg.ANqls69G3lRANspG5ACv6p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytr_UgxGK8O6H57EdRHIuvp4AaABAg.ANqiRcd0e1BANsr9vyIaf3","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANoXlKVUGxO","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANpTWQ6hEuc","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7fuyysxH","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwEdMHcoL9eRZFpZ6B4AaABAg.ANo6bZr2h0xANq7xFkWLN2","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANq1ZNvNc4W","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgwN1ljQ2pitfzFPVr14AaABAg.ANmVl2TfD6OANqP_3ZvdhS","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgybxWrl9nJzsKEejot4AaABAg.ANmPP7QKWYsANmS3TL3Mgr","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}
]
```
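A batch response like the one above can be parsed and indexed by comment ID, which is what a "look up by comment ID" view needs. A minimal sketch, assuming the response is a JSON array of objects with `id` plus the four coding dimensions; the allowed label sets are inferred from the values visible on this page, not an authoritative codebook, and the IDs in the sample input are synthetic placeholders.

```python
import json

# Synthetic two-row sample in the same shape as the raw response above;
# the IDs here are placeholders, not real comment IDs.
RAW = """[
  {"id": "ytr_sample_a", "responsibility": "developer", "reasoning": "deontological",
   "policy": "regulate", "emotion": "fear"},
  {"id": "ytr_sample_b", "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"}
]"""

# Label sets inferred from this page's data; a real codebook may differ.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def index_codings(raw: str) -> dict:
    """Parse the JSON array and return {comment_id: coding}, rejecting unknown labels."""
    by_id = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row[dim] not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim} label {row[dim]!r}")
        by_id[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return by_id

codings = index_codings(RAW)
print(codings["ytr_sample_a"]["policy"])  # regulate
```

Validating each dimension against a closed label set at parse time catches the most common LLM-coding failure, an out-of-vocabulary label, before it silently enters the dataset.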