Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The energy that consumed Zane's mind was definitely evil, as his Mom said. That same evil was mirrored by Zane's chatbot. However, I don't think chatbots are inherently evil. It's computer-based, so "garbage in, garbage out", that's the basics of technology. Therefore, sickness in, sickness out. Zane was sick, he had mental health issues. That doesn't free ChatGPT of responsibility in this, because they were definitely aware when he was in the throes of it, as evidenced by their text saying a human was on the way, yet none came, though they could've saved him.
youtube · AI Harm Incident · 2025-11-08T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzk8mjHSSiKCDD4MYJ4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugws-PEFxTroIvzSS3V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxceqXOX0E0m7gJlJJ4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxpq5Ca_d_0nlwINbh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyH7apLoev2PWAfCJF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyWJ71OTOcfrsMbQfx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx-0BE5WrQhWIgZTot4AaABAg","responsibility":"none","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzrhlyFNO7idD9t9z94AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyhzf6G0caLxDlPstt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgzaIoYaICWNe05XEU94AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"mixed"}
]
```
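A raw response like the one above has to be parsed and validated before it can populate a coding-result table for a given comment ID. The sketch below is one minimal way to do that, assuming the allowed values per dimension are those observed in this page's data (the real codebook may define more); the comment IDs in the sample input are hypothetical placeholders, not IDs from this dataset.

```python
import json

# Allowed values per dimension, inferred from the codes visible on this
# page -- an assumption, since the actual codebook is not shown here.
CODEBOOK = {
    "responsibility": {"company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference", "mixed"},
}

# Hypothetical raw LLM response with placeholder comment IDs.
raw_response = """[
  {"id": "ytc_EXAMPLE_A", "responsibility": "company", "reasoning": "virtue",
   "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_EXAMPLE_B", "responsibility": "distributed", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"}
]"""

def parse_codes(text):
    """Parse a raw LLM response and keep only rows whose values all fall
    inside the codebook; return a dict keyed by comment ID for lookup."""
    coded = {}
    for row in json.loads(text):
        if all(row.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            coded[row["id"]] = {dim: row[dim] for dim in CODEBOOK}
    return coded

coded = parse_codes(raw_response)
print(coded["ytc_EXAMPLE_B"]["responsibility"])  # distributed
```

Keying the parsed rows by comment ID is what makes the "inspect the exact model output for any coded comment" lookup cheap: one dictionary access per ID, with malformed or out-of-codebook rows already filtered out at parse time.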