Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up by comment ID.
Random samples

- "The problem here not a failure of AI per se. AI is only as good as the data you …" (ytc_Ugz6kNrE6…)
- "And people keep defending AI. Even if the problem is generally greater than AI,…" (ytc_UgwmblAxB…)
- "the article says the woman was struck *immediately* after stepping off the sidew…" (ytc_Ugyp1kPe1…)
- "😡Old examples of data, accidents and vehicles ability to date. Examples YEARS OL…" (ytc_UgwlVK2rw…)
- "Bruh, we all know what that ring is for Whatever scientist has made that robot…" (ytc_UgwunzWdH…)
- "That’s the spirit! Let’s enjoy this life we all have. AI cannot take away my w…" (ytr_UgwMlm7Wm…)
- "Reminder that Ai doesn't do anything but repeat our behaviour, if you present th…" (ytc_Ugw7EqyWF…)
- "so when will the companies determined to run humanity over the cliff edge work o…" (ytc_UgyRHswWb…)
Comment

> Should it come as any surprise? I mean among other things, they train AI using media propaganda and pathological subreddit community dialogs. Whoever thought that was a good idea deserves to be first in line for elimination when AI decides humanity's fate.

youtube · AI Harm Incident · 2025-07-28T13:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:26:44.938723 |
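A stored coding could be rendered back into a dimension table like the one above with a small helper. This is a sketch only: the record shape and the name `render_coding_table` are assumptions, not part of the tool.

```python
from datetime import datetime, timezone

# Hypothetical record shape, mirroring the table above.
coding = {
    "responsibility": "developer",
    "reasoning": "deontological",
    "policy": "liability",
    "emotion": "outrage",
}

def render_coding_table(coding: dict, coded_at: datetime) -> str:
    """Render a coding record as a markdown Dimension/Value table."""
    rows = ["| Dimension | Value |", "|---|---|"]
    for dim, value in coding.items():
        rows.append(f"| {dim.capitalize()} | {value} |")
    rows.append(f"| Coded at | {coded_at.isoformat()} |")
    return "\n".join(rows)

print(render_coding_table(coding, datetime(2026, 4, 27, 6, 26, 44, tzinfo=timezone.utc)))
```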
Raw LLM Response
```json
[
{"id":"ytc_UgwS0UXTUu3OCr2h7u14AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy7lZ8NSe05ufu08CR4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugxo8IE_eOizaZxU7p94AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxxN3ZzMLXX1VKoCRR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugyd-S-p-CaTO3bST7p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugws29neEky5h8y5r4F4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw27OER8oCYXs8Kbe54AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzhRZf026xW5FjFZrN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgziYmCfpM6ZtBll2vd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxSZqfcSrKSZGkl0-d4AaABAg","responsibility":"user","reasoning":"virtue","policy":"industry_self","emotion":"approval"}
]
```
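The raw response is a JSON array of per-comment codings. A minimal sketch of how such a response might be validated and indexed by comment ID follows; the dimension vocabularies are inferred from the values visible in this sample (the full codebook may allow more), and the name `parse_codings` is hypothetical.

```python
import json

# Dimension vocabularies inferred from the sample response above (assumption:
# the real codebook may define additional values).
VOCAB = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue", "mixed", "unclear"},
    "policy": {"liability", "regulate", "industry_self", "none", "unclear"},
    "emotion": {"outrage", "resignation", "approval", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response and index codings by comment ID.

    Raises ValueError if a record lacks an ID, misses a dimension,
    or uses a value outside the (assumed) vocabulary.
    """
    by_id = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record without id: {rec!r}")
        for dim, allowed in VOCAB.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        by_id[cid] = {dim: rec[dim] for dim in VOCAB}
    return by_id

# Usage with a single hypothetical record:
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"liability","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_example"]["emotion"])  # outrage
```

Validating against a closed vocabulary at ingest time catches the most common LLM-coding failure, an out-of-codebook label, before it silently enters the dataset.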