Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
AI will be the best of us, and the worst of us, because of us. In Dark Nature, biologist Lyall Watson explored the emergence of evil with the emergence of Consciousness. The faster, more complex the mind -- the faster the emergence, and increased scale of potential evil. This troubling circumstance may be why humans have yet to find other civilizations that survived achieving AI. As with much of the history of science the priority is playing God and asking "can we" rather than "should we".
Source: youtube · Topic: AI Harm Incident · Posted: 2025-07-26T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugwxi3e5bXAEEbVaR3Z4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx8sH5KPD0RcwPnQ0N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy88rTkpmdSgWMEccV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwfREHqYTmR_eyZ0z94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyYqAhz1yJY_3oC1tl4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwHKXxzJ2eZVIv2Qb54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw4jHpkidSn9RaK4ex4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugy-kkbBdtcvD5cAyoh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzDb6k0KWFnkCWpbNd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyOVUtBWbyRI7xHH714AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
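A response like the one above can be validated before the codes are stored: parse the JSON array, keep only rows that carry a comment ID, and check each dimension against its allowed values. A minimal sketch in Python, assuming the value sets inferred from the sample output (the full codebook may define more categories):

```python
import json

# Allowed values per coding dimension -- an assumption inferred from the
# sample response above, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"distributed", "company", "ai_itself", "user", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation",
                "approval", "mixed", "unclear"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, dropping malformed rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # skip rows without a comment ID
        # keep the row only if every dimension holds an allowed value
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

if __name__ == "__main__":
    raw = ('[{"id":"ytc_example","responsibility":"user",'
           '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
    print(parse_coding_response(raw))
```

Rows that fail validation could instead be queued for re-coding rather than silently dropped; the filtering approach here is just the simplest variant.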