Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Lawyers and judges are using AI because they think they can because they said th…" (ytc_UgxNlUGc3…)
- "Honestly I had no idea most shows nowadays WEREN'T written by AI. They're so fo…" (ytc_UgyR96NJQ…)
- "Robot and guns are to words that should never be used in a sentence!!!! 💀…" (ytc_UgznzpEX0…)
- "Thanks for your comment, @user-wk1sf6rj4t! So, are you suggesting that the robot…" (ytr_UgzaTeu9H…)
- "Haha, that would be a fun upgrade! While Sophia might not wash cars just yet, sh…" (ytr_Ugy2VQKF2…)
- "No one in the world call themselves artists when using ai, just sybau and stop a…" (ytc_UgzLBFjlX…)
- "This is exactly the way forward-looking people think about AI. Stuff like this a…" (ytc_UgwwCO9_m…)
- "Listen I can't draw worth shit and I know I don't have the time to sit down and …" (ytc_UgwgF2Nfs…)
Comment

> If the roads are full of self driving cars, then all accidents can be avoided by proper driving algorithms. Why is the example self driving car riding a semi truck's rear? That's passenger death due to improper algorithm. Same with 99% any other driving scenario.

youtube · AI Harm Incident · 2018-10-24T05:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxNu6orq72VYmfHfwB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzZh_afhC_OOFGLQXR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyCTPsECAP96PGh3Hl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgxJKKQ9sqMp_Ti81H54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxqNM8gW2hHoyqHe5l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxnIUdclFIQ-4Rv2z54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzlBBbufEX3_0ASe354AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwCGqFtlXMJ6s8DwaZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwtXdfQuJN50FtVCfZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyosQqVFTeUat0GwLx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
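The raw response is a JSON array of per-comment codes on the four dimensions shown in the table above. A minimal sketch (in Python, assumed here; the `parse_coding_batch` helper and the allowed-value sets are illustrative guesses inferred from the coded examples, not a documented schema) of loading such a batch while guarding against out-of-vocabulary values:

```python
import json

# Allowed values per dimension, inferred from the coded examples above
# (assumed, likely not exhaustive).
DIMENSIONS = {
    "responsibility": {"developer", "company", "government", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"liability", "regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "resignation", "indifference", "mixed", "unclear"},
}

def parse_coding_batch(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM response into {comment_id: codes}, skipping rows without an id."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            continue
        codes = {dim: row.get(dim, "unclear") for dim in DIMENSIONS}
        # Map any value outside the expected vocabulary to "unclear"
        # instead of failing the whole batch.
        for dim, value in codes.items():
            if value not in DIMENSIONS[dim]:
                codes[dim] = "unclear"
        coded[cid] = codes
    return coded
```

Keying the result by comment ID supports the "look up by comment ID" flow above, and coercing unexpected labels to `"unclear"` keeps one malformed row from discarding an otherwise valid batch.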