Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- “AI is still crude in some applications, such as medical coding, say. But the in…” (ytc_UgwlMogCe…)
- “It’s not evil, it’s emotionless. It is the epitome of indifference towards you. …” (ytc_UgzAE7Jor…)
- “Thats why Glaze and Nightshade exists / Its a Copyright protection, that will stu…” (ytr_UgxODws3S…)
- “‘To show the world I can also do this activity’ dawg you aren’t DOING anything o…” (ytc_Ugxuou8lY…)
- “The problem is not AI taking menial jobs to free up time of humans to put toward…” (ytc_UgyVYEVXq…)
- “AI still will be better than humans see whats goin on in world right now. we hav…” (ytc_Ugx56I858…)
- “We're at a point now where only a revolution will save us, but everyone is to co…” (ytc_UgwR5sXJm…)
- “The issue with AI that nobody understands is that it defines or will eventually …” (ytc_UgyS27CgN…)
Comment
You train AI on all of humanity, of course there's going to be a monster in it.
There's a massive monster inside of all of humanity.
Watch some war documentaries some times, prison videos, heck just watch red pill dating videos lol.
youtube · AI Moral Status · 2025-12-16T06:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy_EsRwWhiHz5m_GPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyq-o_mbQLSnC20AjF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxPmX5XJO4ENh8QJpt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy3XJnMjeu7eYVAhPB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-ImwdEeQmxa99MKR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugy4-7LE6AY4Gbe36pZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwHfg8wjoo7hh_83PN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwkSY4TA5RCvMaHVbB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz1PDBCHiliNYw9F2F4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugwa3tlM-fVklrrDAsN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
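The lookup-by-ID inspection above can be sketched in a few lines: parse the raw model output as a JSON array, index the records by comment ID, and screen each code against the codebook before trusting it. This is a minimal sketch, not the tool's actual implementation; the field names come from the response shown above, but the allowed-value sets are inferred from the codes observed here and the real codebook may define more categories. The two records in `raw_response` are copied from the output above for illustration.

```python
import json

# Two records copied from the raw model output shown above.
raw_response = '''[
{"id":"ytc_Ugy_EsRwWhiHz5m_GPl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwHfg8wjoo7hh_83PN4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"resignation"}
]'''

# Allowed values per dimension, inferred from the codes observed in this
# response (an assumption; the actual codebook may be larger).
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "company", "user",
                       "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw response and index codes by comment ID,
    dropping any record with an out-of-codebook value."""
    by_id = {}
    for record in json.loads(raw):
        ok = all(record.get(dim) in vals for dim, vals in ALLOWED.items())
        if ok and "id" in record:
            by_id[record["id"]] = record
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UgwHfg8wjoo7hh_83PN4AaABAg"]["emotion"])  # resignation
```

Validating before indexing means a malformed or hallucinated code is silently skipped rather than surfaced in the UI; a real pipeline would more likely log or re-prompt on such records.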