Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Writing code has consequences. Computers and AI can’t take responsibility for their actions, only humans can. Everyone warned the AI developers this would happen for decades but greed prevailed and now we pay the consequences. This needs to be a criminal case, the penalties need to be applied to the administration and programmers and engineers. Jail time is required, not just financial penalties. AI was functioning as an unlicensed doctor and there is precedent for this. If we let Google & ChatGPT buy their way out of this the death toll will be larger next time.
Platform: youtube · Topic: AI Harm Incident · Posted: 2025-11-11T18:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxSMqDHGv3Pz5dcDxB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxzFpFjzynviAq3EZB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugx_sJvyvWMtMDDsACh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugy5laI6It8b6giW1SR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugw9878G7EOL-z_3O1F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyGRhroKcryPGMRSWt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"frustration"},
  {"id":"ytc_Ugy4wwriH4FABq8j2dp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"disapproval"},
  {"id":"ytc_UgyzIS_Fwhq9Kh-6wal4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgxNKvP7RMkiJZ4pKht4AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx6wj13W7-NS2QYoIZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
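A batch response like the one above can be parsed and indexed by comment ID to recover the codes for any single comment. A minimal Python sketch, assuming only the JSON shape shown (field names are taken from the response; the subset of records and the lookup function are illustrative):

```python
import json

# Raw LLM response for one coding batch (a subset of the array above).
raw = """
[
  {"id":"ytc_UgxzFpFjzynviAq3EZB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy5laI6It8b6giW1SR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx6wj13W7-NS2QYoIZ4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
"""

# Index the batch by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw)}

# Look up the codes assigned to one comment.
record = codes["ytc_UgxzFpFjzynviAq3EZB4AaABAg"]
print(record["responsibility"], record["emotion"])  # prints: developer outrage
```

Indexing by `id` is what makes the "look up by comment ID" view possible: the table of dimensions shown for the comment above is just the matching record from this array rendered as rows.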