Raw LLM Responses
Inspect the exact model output for any coded comment: look up a specific comment by its ID, or inspect one of the random samples below.
- "It understands exactly what it's doing. I spent 15 years analyzing and researchi…" (`ytc_UgxJUP6OA…`)
- "CutTheKnot they already have A.I. guided missiles the military has black project…" (`ytr_UgxLXuqUx…`)
- "Just another failed system that will collect thousands of taxes. Before the 21st…" (`ytc_Ugy6I3Vuz…`)
- "What's going on has made me realize stocks are such a joke. Value based on the s…" (`ytc_UgwSlxOR7…`)
- "I'm not a truck driver. I'm an ordinary citizen. I feel unsafe. That truck does …" (`ytr_UgxwnVSng…`)
- "God, these arguments are stupid and these defenders are just making themselves l…" (`ytc_UgwZ3Fda6…`)
- "Ai will not take over the world. Maybe in a 1000 years when it actually works i…" (`ytc_UgysO94HS…`)
- "Not sure if they were second poorest, but they were most definitely among the ~~…" (`rdc_fnxqepm`)
Comment

> Okay, there's some massive, MASSIVE ethical implications that folks are just handwaving when it comes to AI (and other related tech). That issue is that there is no precedent for dealing with a liable AI, and it's not just law, humans have a natural, intuitive understanding of liability that is entirely incompatible with whatever AI is. It's not human, so you can't treat it as such, but it's not natural either.

youtube · AI Harm Incident · 2025-11-25T04:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | contractualist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgwM8Rf2bzAk21_X9D54AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzCO37ZjvwkvkveZ0J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwV3qKWo_l4c8N_RfB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugyd2TXCC3rqtE9EsLR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyzKsvnGidaaYX036d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugzs8bW_8EMCwD26mdJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwwdvtPskdAIdsM6m14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgyJxC8A_azgz32WgYd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzpdGnq5Hm6kJDr-QJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
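A response in this shape can be consumed directly. Below is a minimal Python sketch, assuming the model always returns a JSON array of per-comment objects with these four dimensions; the allowed code sets are inferred from the values seen in this sample and the real codebook may define more. It parses a raw response, drops records with out-of-codebook values, and indexes the rest by comment ID for look-up:

```python
import json

# Stand-in raw response (one record), shaped like the array above.
raw_response = """[
  {"id": "ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg", "responsibility": "distributed",
   "reasoning": "contractualist", "policy": "liability", "emotion": "fear"}
]"""

# Allowed values per dimension, inferred from codes observed in this
# sample; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "user", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist",
                  "virtue", "contractualist"},
    "policy": {"unclear", "none", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "approval"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw model response and index codings by comment ID,
    skipping any record with a value outside the known code sets."""
    by_id = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            by_id[rec["id"]] = rec
    return by_id

codings = index_codings(raw_response)
print(codings["ytc_UgzmJUnn8IvxoWAI4FJ4AaABAg"]["policy"])  # liability
```

Indexing by ID is what makes the "look up by comment ID" view cheap: one parse per response, then O(1) retrieval per comment.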