Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "How safe are these self driving cars he asks. Well Elaine got killed by a self …" — ytc_Ugx8tFGES…
- "AI is regardless completely wrong no matter the circumstances. It is an infinite…" — ytr_Ugwu7DqEc…
- "If I'd meet an ai bro I'd yell get out to them in my best Tuco impression…" — ytc_Ugz8OOlU7…
- "Time to return to school and learn trades. Like fixing the Ai computers when the…" — ytc_UgzWQ8qet…
- "This is so fucked up. If you are going to make a vid about Ai then at the least …" — ytc_UgwLi7tEv…
- "Before / Robot: Set up me first, than I will setup you / After / Robot: I'm built up, …" — ytc_UgwC3S1b-…
- "Three and a half years later, and Enrico Coeira's discussion on this topic is on…" — ytc_UgwvlLRYC…
- "@GlamBoyAnt Ok... Now you need to realize that for someone like me who doesn't …" — ytr_Ugxu2Ypz7…
Comment

"I hate these self driving cars because it won't know how to handle human behavior. Let me explain it can drive perfectly to your location but it can't factor in how other people drive, there are people who run red lights for example."

| Field | Value |
|---|---|
| Platform | youtube |
| Topic | AI Harm Incident |
| Posted | 2025-12-11T03:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response

```json
[
  {"id":"ytc_Ugy4dw_LzEYjDaszAy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzu99oWvD-C4Yq-JrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyBJR_QfAzOL68oTXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxxEM58A-queajnWjZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwJQHlD2jo_QZ3wzi14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz39XFycav6_r-v-uN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvHfdXK39vLgr6Y1J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugyv24XLkX22KfQ9Iat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyrbQE2UMmUG5CrWQt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzKr6LKdWFCJOKWwcp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
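A raw response like the one above has to be parsed and checked before its codes are stored against comments. Below is a minimal sketch of such a validation step, assuming the allowed values per dimension are exactly those visible in this sample (the real codebook may define more categories), and that `validate_batch` and `SCHEMA` are hypothetical names, not part of any actual pipeline shown here.

```python
import json

# Allowed values per dimension, inferred only from the sample output above;
# the actual codebook may include additional categories.
SCHEMA = {
    "responsibility": {"ai_itself", "company", "government", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "fear", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against SCHEMA."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in the samples start with ytc_ (comments) or ytr_ (replies).
        if not rec.get("id", "").startswith(("ytc_", "ytr_")):
            raise ValueError(f"unexpected id format: {rec.get('id')!r}")
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugy4dw_LzEYjDaszAy94AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')
coded = validate_batch(raw)
print(coded[0]["responsibility"])  # ai_itself
```

Failing fast on an out-of-schema value keeps malformed or hallucinated codes out of the results table rather than silently recording them.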