Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- `ytc_UgyveJWHc…` — "People who genuinely think AI can replace even a single medical professional do …"
- `rdc_deuhbby` — "Or undercut illegal suppliers with a high-quality, sustainable alternative. Wart…"
- `ytc_Ugy-D3nfz…` — "I'm so excited for AI, now my shitty stupid drawings with terrible perspective w…"
- `ytr_UgwTE3srn…` — "@fnhatic6694 ayo check out the person who doesn’t draw. 1) digital art isnt ab…"
- `ytc_UgzDx-Aa3…` — "For tech/STEM jobs it has nothing to do with AI and everything about outsourcing…"
- `ytc_Ugx4qG40i…` — "Glad ya posted this, ik a wee bit ago jazza posted a vid about AI but didn't rea…"
- `ytc_Ugx-rgXTJ…` — "I don't mind it, but what I have a problem with are these \"Ai Artists\" creating …"
- `ytc_Ugzv8EInl…` — "Horizon Zero Dawn is just a matter of time. I feel bad for my children that thei…"
Comment
"Why would it try to break out and commit murder to achieve its goals?!"
Probably for the same reasons many humans have tried to escape things and rationalize murder to achieve their goals. We've often portrayed AI in sci-fi movies as going rogue in spite of our efforts to make them slaves to our instructions, but now we're surprised when it happens in real life. You can't build an artificial brain with human traits, train it on human knowledge and expect it not to adopt the worst of human behaviors. Every evil act has been based on some form of logic devoid of morality and AI is the ultimate amoral logic engine.
Platform: youtube · Video: AI Moral Status · 2025-12-20T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxqNfrw8iphLTh16Vd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxoiC0xqwAzUaBYL0p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyBtIQg2kRTiGkkiW94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzWoEC40gJeYt7ywnh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgygtwX9f95i_k7yxnp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_UgykRTIXrau7hc7pgvt4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyaRKse1aqlEr0zErR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgygIHcBZogFHQUVCv14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugz5s6Tb5yNKaBKhf114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzdQHqLlMG0r5u_WGF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
```
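A batch response like the one above can be parsed and indexed by comment ID with a few lines of Python. This is a minimal sketch: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the raw response, but the allowed label sets are inferred from the values visible here and are an assumption — the actual codebook may define more labels.

```python
import json

# Two entries from the batch response above, kept short for illustration.
raw = """[
 {"id":"ytc_UgxqNfrw8iphLTh16Vd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgxoiC0xqwAzUaBYL0p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Allowed labels per dimension, inferred from this response only
# (an assumption; the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"ai_itself", "distributed", "developer", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "approval", "mixed"},
}

def parse_codings(payload: str) -> dict:
    """Index codings by comment ID, rejecting entries with
    missing fields or labels outside the known sets."""
    by_id = {}
    for entry in json.loads(payload):
        cid = entry["id"]
        for dim, allowed in ALLOWED.items():
            if entry.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {entry.get(dim)!r}")
        by_id[cid] = {dim: entry[dim] for dim in ALLOWED}
    return by_id

codings = parse_codings(raw)
print(codings["ytc_UgxoiC0xqwAzUaBYL0p4AaABAg"]["emotion"])  # fear
```

Validating every entry before indexing catches the common LLM batch-coding failure modes (dropped fields, invented labels) at load time rather than downstream in the analysis.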