Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "To get the closest possible thing to your vision or requirements from AI, it doe…" (ytc_UgzwXy19K…)
- "I'm learning AI to work, cos it's what many clients ask. I don't like it, it's e…" (ytc_UgwWYST3p…)
- "OK, go ahead and give me an example of corporate America shaving off the expense…" (ytc_Ugz_1yaw-…)
- "How would AI acquire a true morality? They lack conscience and making moral deci…" (ytr_Ugzvi5Azd…)
- "This is false. And it's this person's projection onto AI, not AI itself. That me…" (ytc_UgyAAvlf3…)
- "AI so far has only brought inhumane harmful results. Not in the best interests o…" (ytc_UgwUB83k_…)
- "I used ai to make a simple connection monitoring script. The basic approach was…" (ytc_Ugy2IdgAN…)
- "… yet another ivy league pointy head academic theorist with zero real world busi…" (ytc_UgzelkCB2…)
Comment
Nah fam, because I was reading a web novel and the protagonist has as backstory that when he was in highschool (and later his then girlfriend, which died) got into an accident thanks to a self driving vehicle that went crazy 🗿
Spoiler: Even after decades he feels extreme guilt because he was the first victim and he didn't warn her to not trust self driving taxis.
Source: youtube · AI Harm Incident · 2025-01-21T02:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[{"id":"ytc_UgyotX78CACjFjarVdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugyev-4i_EzisCpnLYF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugxoyf206KxI3BQvnkl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxWu0nt4neZW47YlPt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwgfHIWHC8hWV_McCh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyg5TVeTtLY3HWNUAV4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxnAB4ZOJieP9Ynl0p4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx1TWqDVFUeqK3kZXB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyGUqSV8eQPpFBn3Bl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgxdUAV-XLyRLlxu15h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}]
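A batch response like the one above can be parsed and indexed by comment ID before the codes are stored. Below is a minimal sketch of that step; the allowed value sets are inferred only from the values visible in this sample, not from the project's full coding scheme, and the function name `parse_batch` is illustrative.

```python
import json

# Dimension values observed in the sample response above; the actual
# coding scheme may include additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"unclear", "ban", "none", "liability"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response and index the codes by comment ID,
    rejecting rows with missing fields or out-of-scheme values."""
    coded = {}
    for row in json.loads(raw):
        cid = row.get("id")
        if not cid:
            raise ValueError(f"row missing id: {row!r}")
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch in the same shape as the response above.
raw = ('[{"id":"ytc_x","responsibility":"ai_itself","reasoning":"virtue",'
       '"policy":"unclear","emotion":"mixed"}]')
codes = parse_batch(raw)
print(codes["ytc_x"]["reasoning"])  # virtue
```

Validating against the scheme at parse time catches the common failure mode where the model emits a value outside the codebook, rather than letting it silently reach the coded table.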