Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- Terrible idea. It teaches kids to trust anything the computer says without quest… (ytc_UgzSmh8KP…)
- I notice you didn't ask "Dan" how to make a bomb. CHATGPT is just role-playing, … (ytc_UgzTu1-K0…)
- Thanks for getting a guy who sounds like the terminator to warn me about AI… (ytc_UgxH7ktJw…)
- I don’t believe it will take a century to see robots as intelligent as humans. W… (ytc_Ugyb7pamQ…)
- I have an idea, what if we just don't post any art for anyone to steal, yes it w… (ytc_UgyFP1UiG…)
- Lavender, you may not want to read any of this—it's just noise that you don't ne… (ytc_Ugx9mfyQZ…)
- Much more likely that this is people in “AI jobs” who get laid off because there… (rdc_oi2g7yb)
- Gotta be honest, that entire post sounds like one dude in the NaNoWriMo marketin… (ytc_UgynzsSId…)
Comment
That is fucking haunting. I don't doubt that these are merely the opening salvos of what will be a rapidly developing field of AI ethics, but it's hard to look at this and think it's remotely safe in the hands of certain categories of people, especially the emotionally disturbed.
Platform: youtube
Event: AI Harm Incident
Timestamp: 2025-11-08T07:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgwhuSzkuOeqTPkemN14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXIVf1bLRG77MwnUx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyJZCjEp0ZPiz0e9fx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyEMfx_-Avxy4wly7B4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyDJDYsokgoJQUN19h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwlxsHdVAtb7wx6raV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw-jm2agfza8FWdHgh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzuZVrovRkSCdqaDzN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzh_RC7nUhRUTHU44p4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw625q1beE1Z5UPLvJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
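A raw response like the one above can be checked programmatically before its codings are accepted. Below is a minimal validation sketch; the allowed category values are inferred only from the entries visible in this batch (the real codebook may define additional categories), and the `validate_batch` helper and ID prefixes `ytc_`/`rdc_` are assumptions drawn from the examples on this page.

```python
import json

# Allowed values inferred from the sample output above; the actual
# coding scheme may include categories not seen in this batch.
ALLOWED = {
    "responsibility": {"user", "company", "government", "ai_itself", "none"},
    "reasoning": {"virtue", "consequentialist", "deontological"},
    "policy": {"none", "liability", "regulate", "industry_self"},
    "emotion": {"fear", "indifference", "resignation", "outrage",
                "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    entries = json.loads(raw)
    valid = []
    for entry in entries:
        # IDs in this view start with a platform prefix (ytc_, rdc_).
        if not entry.get("id", "").startswith(("ytc_", "rdc_")):
            continue
        # Every dimension must be present and hold a known value.
        if all(entry.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(entry)
    return valid

raw = ('[{"id":"ytc_UgwhuSzkuOeqTPkemN14AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"fear"}]')
print(len(validate_batch(raw)))  # 1
```

Rejecting rather than repairing malformed entries keeps the coded dataset consistent: a hallucinated category value is logged as a failed coding instead of silently entering the results table.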