Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- ytc_UgyjI45D6… — "Lmao get real by 2030 we won't even have robots mainstream. To get rid of all jo…"
- ytr_UgzUoEgi3… — "Sorry to disappoint you with the pace. But the real risk is if for even convers…"
- ytc_UgwyWhWQj… — "I am disabled, and art is one of the only things I can do. I'm sure in the past I…"
- ytc_UgzMHKXe2… — "This is silly. Common sense says instead of thumbing your nose at using AI to ge…"
- ytc_Ugx__l1bt… — "If this is not some AI made shit, then is so disturbing and dangerous on many le…"
- ytc_UgxBM3ry0… — "I don't want to be doomeric maybe I do, but recently a book author tried to sue…"
- ytr_UgznWUVUo… — "That makes a lot of sense. Eliezer doesn't have a degree, doesn't even have a hi…"
- ytc_Ugzjf2XSH… — "Actually, AI might be the last chance for the biosphere on Earth to stop mass ex…"
Comment
I have to laugh, yet it’s a defeated, pathetic attempt at mockery of the arterial line in the subject ~ control of the human made Artificial Intelligence so it doesn’t get scarier, more violent than is known at this date..say just for the record, today September 2025…
Remember years ago “we” were concerned about the overuse of violence in movies, even children’s cartoons. That concern didn’t curb violence, it exacerbated violence in all forms of entertainment. Good luck trying to tame a wild intelligence ~ especially since it’s far away from the coral and not programmed to diminish or come home with its tail wagging behind it.
Violence is the best way to lose weight and to accentuate hero’s
criminal or otherwise…
Platform: youtube
Video: AI Harm Incident
Posted: 2025-09-28T16:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgwCEvt0HUL8K9FOq4F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgykfPQafn4Ot95khn14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWdVp03a5TfpinN6J4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwbysMCnB_3rzd2XNR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQd6gqF3yEaFVeJKd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxQdKev4ic_kU1IWdd4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy7Ain5h3XBRQdSpZB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgySt_vXZu7FPkbWxYR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyYY5AFx3hWmJ40RGt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzoK9N99EXE6WhI2uN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
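The coding result shown above is one record pulled out of the model's JSON array by its comment ID. A minimal sketch of that lookup is below; the `lookup_coding` helper and the abbreviated sample IDs are assumptions for illustration, not the tool's actual code.

```python
import json

# Abbreviated sample of a raw LLM response: a JSON array of per-comment
# coding records (hypothetical excerpt, mirroring the schema shown above).
raw_response = """
[
  {"id": "ytc_UgyWdVp03a5TfpinN6J4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyQd6gqF3yEaFVeJKd4AaABAg", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse the model output and return the record matching comment_id, or None."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgyWdVp03a5TfpinN6J4AaABAg")
print(coding["responsibility"])  # -> distributed
```

In practice the raw string would first need validation (the model may emit malformed JSON or stray prose around the array), which is why inspecting the exact output per comment is useful.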