Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytc_UgxoC4IXf… · I dont understand why they are so obsessed with making "AI" that replaces human …
- ytc_Ugz62lgq0… · am on the fence with all type of robots and ai but can you answer what would ha…
- rdc_o05grm4 · If no possible way to install a local LLM, how would you go about this? I use it…
- ytr_UgwRmG5dx… · A few of us online yesterday tried to convince a few people that a crime video w…
- ytr_UgxMEMYqG… · And spotting deep fakes is a problem for computer scientists to easily debunk, I…
- ytc_UgwgMEGih… · The fact that no one can actually match the quality kinda angers me, it's unfair…
- ytc_UgzCGMRq-… · AI can prescribe AI engineered medicines for AI diagnosed cellular failure. This…
- ytc_Ugh6k2OP7… · Yes autonomous and platooning trucks are here, but the driving public and govern…
Comment

> If this car gets hacked, you could seriously be kidnapped. Even if it isn't hacked, there should be an AI assistant if the car gets lost so you can provide directions, and there should be a stop button so you can get off when needed. I still would prefer to have conversations with a real taxi driver anyway...it's part of what makes a good travel experience.

Source: youtube · AI Harm Incident · 2025-12-11T22:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugy4dw_LzEYjDaszAy94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzu99oWvD-C4Yq-JrF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyBJR_QfAzOL68oTXh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxxEM58A-queajnWjZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwJQHlD2jo_QZ3wzi14AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz39XFycav6_r-v-uN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwvHfdXK39vLgr6Y1J4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugyv24XLkX22KfQ9Iat4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyrbQE2UMmUG5CrWQt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzKr6LKdWFCJOKWwcp4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
```
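A response like the one above is only usable downstream if every row carries in-schema values for all four dimensions. The sketch below parses a raw response and drops out-of-schema rows; the allowed value sets are inferred from the responses shown here, not from a documented schema, so treat `SCHEMA` as an assumption to adjust against the actual codebook.

```python
import json

# Allowed values per coding dimension, inferred from the sample
# responses above (assumption, not an authoritative codebook).
SCHEMA = {
    "responsibility": {"company", "government", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "ban", "liability", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference"},
}

def validate_response(raw):
    """Parse a raw LLM response (a JSON array) and keep only rows
    whose dimension values all fall inside SCHEMA."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        problems = [
            f"{dim}={row.get(dim)!r}"
            for dim, allowed in SCHEMA.items()
            if row.get(dim) not in allowed
        ]
        if problems:
            print(f"skipping {row.get('id', '<no id>')}: {', '.join(problems)}")
        else:
            valid.append(row)
    return valid

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
print(len(validate_response(raw)))  # 1 valid row
```

Logging and skipping, rather than raising, keeps one malformed row from discarding an otherwise valid batch; invalid IDs can then be queued for re-coding.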