Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "AI is a bubble!!! Also here is an add that teaches you how to use AI. Negative h…" (ytc_UgyKCoONb…)
- "Trust me, I’ve tried learning how to draw, and I didn’t even know that existed t…" (ytc_UgxzqTrmG…)
- "My campus (California State University Fresno) is going all in on adopting AI on…" (ytc_UgxWVQ2qI…)
- "Yeah AI can't make a lighter that's mouth is the open lighter that has a tongue …" (ytc_Ugz_SbkAJ…)
- "Is this case of it saying sorry but not itself being capable of regret or sorrow…" (ytc_Ugw0Memzl…)
- "Since I'm old, I've already been laid off and replaced by AI, back when AI stood…" (ytc_Ugwlf7e4P…)
- "ok you win the moral argument.... but ai is still insane and everyone thinking t…" (ytc_UgzbXNoqr…)
- "No it’s not billions not caring about what’s going on … it’s total blackout of i…" (ytc_Ugx2piLVQ…)
Comment
> Im sorry that she lost her son, thats awful. But this situation is far out there. Todays 14 year olds are not like 14 year olds in my day these kids are on it. Im not understanding how this intelligent kid as she describes him would detach from reality so far that he would take his life. Its strange. He knew it was AI that it wasnt real. I think he had other things going on in his head wether she wants to realize it or not. I mean he thought the bot lived in heaven ? He didnt know what happens when you die? But he knew how to kill himself. Again this is very sad and strange.
youtube
AI Harm Incident
2025-12-10T10:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
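A coded result like the table above can be sanity-checked against the value sets that appear across the raw responses. This is a minimal sketch; the codebook below is inferred only from values observed in this page's data, and the real annotation schema may allow additional values.

```python
# Codebook inferred from values observed in the raw LLM responses on this page;
# the actual annotation schema may define more values (assumption).
CODEBOOK = {
    "responsibility": {"government", "company", "user", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference"},
}

def validate_coding(coding: dict) -> list[str]:
    """Return a list of error messages; an empty list means the coding passed."""
    errors = []
    for dim, allowed in CODEBOOK.items():
        value = coding.get(dim)
        if value not in allowed:
            errors.append(f"{dim}: unexpected value {value!r}")
    return errors

# The coding shown in the table above validates cleanly:
result = {"responsibility": "none", "reasoning": "mixed",
          "policy": "none", "emotion": "resignation"}
print(validate_coding(result))  # []
```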
Raw LLM Response
```json
[
  {"id": "ytc_UgzR0I6858w8X9c_ZKx4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyKrJUAK9A6gwOZdcZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzRKZ4jR7GNUyhYLVV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwvFUUdLSaNAQ7WuG54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzsLWagSnwk9u8Tmdt4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxBxbxC3wj7SuNGMAJ4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_UgxtG5FAVOgx3InYBOR4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwPAwxUCeeOv3a1AHZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "resignation"},
  {"id": "ytc_Ugwqhuz6jzJ_z0hPli94AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw7Tefo4CvNyIb9QAJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
```