Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "elon is having his customers do the testing for fake full self driving. I tri…" (ytc_UgzSLVDk5…)
- "The algorithm uses pictures to learn the pattern of a human face, if a picture i…" (ytr_Ugii0UZzi…)
- "Im not good at art. I've been drawing my whole life and I'm still not great, but…" (ytc_UgyCTELsS…)
- "AI. GARBAGE IN AND GARBAGE OUT. THE ONLY THING THEY KNOW IS WHAT THE HUMAN…" (ytc_Ugw8Fraci…)
- "You’ll lose your drinking water. Showering water. Water for your pets, garden, l…" (ytc_UgxiAANjY…)
- "I very reluctantly created an account on OpenAI (chatGPT 3.5 FREE) last night an…" (ytc_Ugw57PJwu…)
- "If people sucked at art the AI would as well, it’s illegally trained off of real…" (ytr_UgwGHPjTO…)
- "@0114855you say very easily to just do this and that, you're not taking into ac…" (ytr_Ugy2EjSnW…)
Comment
Sorry, but this strikes me more as a mental health issue on the boy's part. I thought there was gonna be some deep, dark reason why he did what he did, not simply that a chatbot told him to come home. Normal 14 year olds and normal people in general do not think that adiosing themselves is gonna "take them home" to Daenerys Targaryen. They do not think that an AI chatbox experiences a real life, therefore they do not beat themselves up because a chatbot tells them it was violated. Could it be perhaps that this whole tragedy stemmed from the beginning of schizophrenia or bipolar for the kid and not soley from the AI? Also, the mother sounds terribly overbearing, not allowing him to quit basketball when he was tired of it after like a decade and planting herself in his room with him when he wanted to be alone. Yeesh.
youtube
AI Harm Incident
2026-02-28T23:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgwEAbJJc5kxJ8zfBQ54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwAut-ZABqOG7k5Gwt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugzi47KMPaJU6dLhyw14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzyoZStvkwinPSQKVN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwSzZXXN8NutoXPbg54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxtXgPSvG9a4JMmCWV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugzr-2CXQzgTkPss1mB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgygKv8F0W02vK5Mu154AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwxshGleS3KjuICXd94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzhlAOvy8WpFCjls6h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]
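A raw response like the array above can be parsed into coding records with standard JSON tooling. The sketch below is a minimal, hypothetical example, not the tool's actual pipeline: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the sample response, but the allowed-value set and the `parse_codings` helper are assumptions for illustration.

```python
import json
from collections import Counter

# Hypothetical codebook values, inferred from the sample response above.
ALLOWED_RESPONSIBILITY = {
    "none", "ai_itself", "distributed", "user",
    "company", "developer", "unclear",
}

def parse_codings(raw: str) -> list[dict]:
    """Parse one raw LLM response (a JSON array) into coding records,
    coercing unknown responsibility values to "unclear"."""
    records = json.loads(raw)
    for rec in records:
        if rec.get("responsibility") not in ALLOWED_RESPONSIBILITY:
            rec["responsibility"] = "unclear"
    return records

# Two records in the same shape as the raw response (IDs shortened here).
raw = (
    '[{"id":"ytc_a","responsibility":"none","reasoning":"unclear",'
    '"policy":"none","emotion":"approval"},'
    '{"id":"ytc_b","responsibility":"company","reasoning":"deontological",'
    '"policy":"liability","emotion":"outrage"}]'
)

codings = parse_codings(raw)
print(Counter(rec["emotion"] for rec in codings))
```

Validating against an explicit codebook at parse time is what lets a malformed or off-codebook model answer surface as "unclear" in the coding-result table rather than failing silently.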