Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Lol, I just don't have the time. I have on average one to two hours free time at…
ytc_UgznswTRj…
@The1stDragonRider Geth was simply VI and was never intended to be AI but they …
ytr_Ugj98md7z…
Imagine how soulless of a society we must be to have stooped so far below the su…
ytc_UgwKmYwUp…
A large problem is our low speed limits. Speed limits are based around generati…
rdc_crxsnsz
This guy a fkn joke and that’s coming from a guy who works construction, if some…
ytc_UgyoXeW1z…
So how long before we have the infrastructure to support the demand our society …
ytc_Ugx3IsD50…
33:37 - The idea of not knowing if or how something experiences the world, and s…
ytc_UgzeL7i2X…
"dude it's just some dumb robot that re-pictures your face into some cartoon thi…
ytc_Ugxm45h5A…
Comment
This is insane. You keep saying that the AI is trying to not die and then say that in order to not get murdered it is willing to act cold and sociopathic even though it is normally nothing like that. You keep talking about it having self-preservation as a problem instead of OUR need to kill everything. Sounds like we're the sociopathic ones here, bud.
youtube
AI Harm Incident
2025-07-23T21:1…
♥ 18
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UgxRZEd2vSbZHqDLz2l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwZu4CZr84MUZLCG5V4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy_jHCUbAYBOEzzASp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx7zl-aUAs_FfPyf1t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2j41VU2PcxMXA3il4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxfMUxc0xd00HoltX14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugzr-rECvpPZNHCtQod4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgypWaUkG2CxVC7dib54AaABAg","responsibility":"user","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyOWvLo54pXmK2bWL14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7Ker4IpncoiFIc7F4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
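A raw response like the one above can be parsed and sanity-checked before it is written into the coding table. The sketch below is a minimal, hypothetical validator (not part of the original pipeline): it assumes the label sets are exactly those appearing in this page's records, and rejects any record whose value falls outside them.

```python
import json

# Assumed label vocabulary, inferred from the records shown on this page.
# The real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company",
                       "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"approval", "outrage", "fear", "indifference", "unclear"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response (a JSON array of coded comments) and
    raise if any record uses an unknown label or is missing an id."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# Usage on a single coded record in the same shape as the response above:
raw = ('[{"id":"ytc_example","responsibility":"user","reasoning":"virtue",'
       '"policy":"unclear","emotion":"outrage"}]')
print(len(validate_coding(raw)))  # 1 valid record
```

Validating at ingest time keeps malformed or off-schema model outputs out of the coded dataset instead of surfacing later as unexplained dashboard values.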