Raw LLM Responses
Inspect the exact model output for any coded comment, either by looking up a specific comment ID or by browsing random samples.
Well, my son attended Montessori which was a complete BUST. They completely disr…
ytc_UgyzdxKC5…
I think there is a bad science test that he or she make people an ai robot😢…
ytc_UgzvvHRSO…
i mean by you doing stuff like this the devs can mould and tweek it tho ai has g…
ytc_UgyYvaKp1…
you can tell the people that use AI images they just dont care about artists, ot…
ytc_UgyshYQiL…
I see AI as just an extension of our own intelligence, not a separate form of in…
ytc_UgySv3XF7…
So we should bet for those who said that we would have fully self-driving cars b…
ytc_Ugz5PvwcD…
don’t even get me started… i made it all the way to scheduling an interview with…
rdc_n0mjoso
Computer robots and AI will "Farm" people. Work, school, home. Absolute slavery…
ytc_UgyMuuNnS…
Comment
Well as a future programmer I'll program the option that avoids killing or kills the least option. Self driving cars are much safer overall anyway.
youtube | AI Harm Incident | 2016-07-26T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugi0oNCeHP92AHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjQqqQ8pvsVC3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UggUueruHXVu1ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgijXoYPKjY_1HgCoAEC","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UghGnVVF0cNqSHgCoAEC","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Uggfmpuz0HRxeHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UggRgo7ALDJJCHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UghT-lpLHZCE-HgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UghO2h5e1TxTNXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghNM3jgeKUHEngCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
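The raw response is a JSON array with one object per coded comment, carrying the same dimensions shown in the Coding Result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of how such a batch could be parsed, validated, and indexed for lookup by comment ID (the helper `index_codes` is illustrative; the two embedded records are copied from the array above):

```python
import json

# Two sample records from the batched response above, inlined for illustration.
raw_response = '''[
 {"id": "ytc_Ugi0oNCeHP92AHgCoAEC", "responsibility": "developer",
  "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UghO2h5e1TxTNXgCoAEC", "responsibility": "ai_itself",
  "reasoning": "deontological", "policy": "regulate", "emotion": "fear"}
]'''

# The four coding dimensions plus the comment ID, as seen in the response.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_codes(raw: str) -> dict:
    """Parse a batched coding response and index it by comment ID,
    checking that every record carries all expected dimensions."""
    by_id = {}
    for rec in json.loads(raw):
        missing = EXPECTED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing {missing}")
        by_id[rec["id"]] = rec
    return by_id

codes = index_codes(raw_response)
print(codes["ytc_UghO2h5e1TxTNXgCoAEC"]["policy"])  # regulate
```

Validating keys up front makes malformed or truncated model output fail loudly at parse time rather than surfacing later as a missing dimension in the coded table.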