Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- Oh yes, AI just good about giving you its top answer, but just like a dictionary… (ytc_UgxCBxW-r…)
- Just watched a video talking about AI has learned to lie all by itself, work aro… (ytc_Ugw1eLD8_…)
- The fundamental problem here is that Autopilot is a single entity. All mistakes… (ytc_UgzqCU3iv…)
- I use LLMs for therapy since I lost health insurance, but only local ones. I lov… (ytc_UgxyPJCj5…)
- I asked ChatGPT to generate a story and it generated a story about a family. The… (ytc_Ugz858Jro…)
- I draw quite a lot and I can say I'm good, and gotta say something Fuck AI It d… (ytc_UgyWC0f1k…)
- "Privacy rights against AI" is basically asking for privacy rights against the e… (ytc_UgwC4Odtv…)
- Edward Snowden warned us about the intentions of our government wanting to keep … (ytc_UgzpRQl_I…)
Comment
I don’t mean to be that person, but I really don’t blame the AI for this. I feel like we used the AI and the fact that he asked it as a scapegoat for the fact that this man was already stupid and he did the same crap we used to do and sometimes still do with regular websites. From what I understand, you’re saying that the AI told him that the bromide thing was only for cleaning, but he did not pay attention and he thought that it would work on his body so that’s his fault not the AI’s. The fact that you made it seem like asking the AI was the reason he went through this is a classic example of shooting the messenger rather than blaming the fact that the message was not received as intended.
youtube · AI Harm Incident · 2025-11-28T23:0… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxFnCCq4cTYdRL2IzZ4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzDGiBp1v6Ng3XuslR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyh67dHDqK-KUD8pl94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyxEB2itQmI7-nx8ZB4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxuJo0n8bAV8KC96uR4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyK0HcGvXezlkL24kt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyWVupV7nLop7sSOPF4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyq8skumUefe5Z8nIh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyeU_vneFJKC8v04Bl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugw_gTKFR8CI5N2WcHp4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "indifference"}
]
```
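A response like this can be checked before it is written back to the database. The sketch below is a minimal validator, not the pipeline's actual code: the allowed values per dimension are inferred only from the responses shown on this page (the full codebook may define more), and `validate_codes` is a hypothetical helper name.

```python
import json

# Allowed values per dimension, inferred from the coding results shown
# above (an assumption; the real codebook may permit additional values).
SCHEMA = {
    "responsibility": {"user", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "deontological", "consequentialist", "unclear"},
    "policy": {"none", "ban", "liability", "unclear"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs on this page all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must be present with an allowed value.
        if all(rec.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgxFnCCq4cTYdRL2IzZ4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"indifference"}]')
print(len(validate_codes(raw)))  # → 1
```

Dropping malformed records rather than raising keeps a single bad row in a batch of ten from discarding the other nine codes.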