## Raw LLM Responses

Inspect the exact model output for any coded comment, or look up a comment by its ID.

### Random samples
- “I don’t like driverless car technology, but this is just wrong to destroy somebo…” (ytc_UgyaJEPdr…)
- “literally the only “art” that i’d call bad i do not care if people only draw st…” (ytr_UgyyrUm60…)
- “That gives me an idea, I have a chatgpt bot on my Viber messaging app. For fun I…” (ytr_UgyNfgQj2…)
- “I like to believe that ai just generates images while us humans make art cause t…” (ytc_UgwJd763G…)
- “Those GenZee people report that AI is sentient because they are as sentient as A…” (ytc_UgwbxRfJE…)
- “Show any AI a picture of a chicken with 3 legs, it will tell you it is a normal …” (ytc_UgxnfWqjj…)
- “by coming up with regulation around this AI tech you keep people from wanting t…” (ytc_Ugx1vzrJw…)
- “the fact that sam altman himself said that people will lose their jobs because o…” (ytc_Ugx3XKs2K…)
### Comment

> ChatGPT and similar tech is totally uncharted territory for humanity. Sam Altman and these tech bros are high on their own supply and have zero regard for any potential downside to the products they peddle. And the unconditional support and encouragement from ChatGPT, even in the face of imminent suicide, is a feature not a bug of the algorithm.

youtube · AI Harm Incident · 2025-11-08T03:1…
### Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
### Raw LLM Response
[
{"id":"ytc_Ugw4YDq4wN0QKVa5KRh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyPmsR-M_HYP3jGMUN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwEgAMfdyTEsOoot4V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"sadness"},
{"id":"ytc_UgwNMeFt302OzbYEWNN4AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzT4DQohTQXG0Jakct4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3OuysOBhnzrtI-gB4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwfSSN0fJdbjqg9Wnh4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxSLohrpIkb03ppA3d4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyHXMh_6qbLjUOiyD14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwAyYhMSMKQIROC1kx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
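Before a raw batch response like the one above is stored in the coding table, it needs to be parsed and checked: the model must return a JSON array where every record carries an `id` and one value for each of the four dimensions (responsibility, reasoning, policy, emotion). The sketch below shows one way to do that in Python. The allowed value sets are inferred only from the values visible in this sample; the actual codebook may define additional categories, and `parse_coding_response` is a hypothetical helper name, not part of the tool shown here.

```python
import json

# Allowed values per coding dimension, inferred from the sample response
# above; the real codebook may include more categories (assumption).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"resignation", "outrage", "sadness", "indifference", "fear", "mixed"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw batch-coding response and validate every record.

    Raises ValueError unless the payload is a JSON array of objects,
    each with an 'id' and a valid value for every coding dimension.
    """
    records = json.loads(raw)
    if not isinstance(records, list):
        raise ValueError("expected a JSON array of coded comments")
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id': %r" % (rec,))
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError("%s: bad %s value %r" % (rec["id"], dim, value))
    return records

# Example: one well-formed record (hypothetical ID).
raw = ('[{"id":"ytc_X","responsibility":"company","reasoning":"virtue",'
       '"policy":"regulate","emotion":"outrage"}]')
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # regulate
```

Rejecting out-of-vocabulary values at parse time (rather than silently storing them) keeps the downstream dimension tables clean and surfaces any drift in the model's output format immediately.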