Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "@ first of all, I was being nice, I never tried to insult anyone, I'm an artist …" (ytr_UgxT0rhgY…)
- "So if one robot learns how to harm humans it goes to the cloud (AI consciousness…" (ytc_Ugx27Mhft…)
- "This is a sad reality but I see no other option but to accept it. AI will never …" (ytc_UgzREamaL…)
- "AI doesn't know what kind of skin goes where either. They end up putting hand sk…" (ytc_UgxCN3dly…)
- "But sadly we need AI to solve our evolution bottleneck. We need AI to unlock the…" (ytc_UgyzWGMAv…)
- "There's a strong, pervasive hatred of women throughout these types of men that i…" (ytc_UgzybaGaJ…)
- "i certainly create art every day idk about you. i don't use ai to do it though…" (ytr_UgycFNTGh…)
- "> I don't see how this is sustainable unless they can convince people to star…" (rdc_l4bx8wf)
Comment

> EVERY SINGLE PERSON before AI became what it is now FUCKING TOLD US this would happen. Every bit of movies depicting AI (Terminator, 2001: A Space Odyssey, etc.) PREDICTED this would fucking happen. And like every prediction, we were concerned with the "Could you?" rather than the "Should you?". We're fucking cooked.

Source: youtube · Topic: AI Harm Incident · Posted: 2025-09-10T22:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugxh1xi56SMv6u97zz14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxleEaSbK4lECNIU0V4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxcb9YouaPiukCrz014AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzdB475xcm4XTKd1Fd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugzmdt4L9nN-vU-VazV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzUF9TqSbXZYTx1Bgl4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzqPEZrGSk3KjjTgTB4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxTH3Z5tjfzLwhCOAJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw-uQu20wa-9XCc-8t4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugyad0zFCG4DNz6WvkJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
```
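The lookup-by-comment-ID step this page describes can be sketched in a few lines: the raw LLM response is a JSON array of per-comment records, so indexing it by `id` gives constant-time lookup. This is a hedged illustration, not the dashboard's actual code; `raw_response` simply reproduces two of the records shown above for brevity.

```python
import json

# Fragment of the raw LLM response above (two records kept for brevity).
raw_response = """
[
 {"id":"ytc_Ugxh1xi56SMv6u97zz14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzdB475xcm4XTKd1Fd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]
"""

# Index the coded records by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codes_by_id["ytc_UgzdB475xcm4XTKd1Fd4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # company outrage
```

The same dictionary-indexing pattern extends to the full response: one pass over the parsed array, then every subsequent inspection is a plain key lookup.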