Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I live about A MILE from where this happened. I have seen ubers for the past year driving their fleets SO much and this hasn't happened once until now. If you read descriptions of the scene, it's doesn't seem likely that uber is at fault. She was j-walking her bike across the street most likely, and although it's sad she died, it's not really uber's fault. If you see these cars drive, not only do they drive annoyingly at the speed limit which I believe is either 35 or 45 depending on how close it was to the bridge, they follow those traffic lines perfectly when in automatic mode. And according to the police descriptions, it was on automatic mode. But this doesn't mean no one is in the car, no, there's definitely a person there behind the wheel just in case and even THEY didn't prevent this from happening. You hear of one car hitting someone yet no one wants to mention the 1 million+ people who died from road accidents (from humans). If we were to hurry up and perfect this AI, this would happen a LOT less often overall and traffic would likely be way better everywhere it was implemented. So all those people who are mad at this, it's just so unwarranted and unhelpful.
youtube AI Harm Incident 2018-03-22T07:5… ♥ 3
Coding Result
Dimension       Value
Responsibility  user
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"ytc_UgwmPGoCY107ZuG02rp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz5sTL3jf5t-hPAX6V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyYpk1AW0oXrFbEa5d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"disapproval"},
  {"id":"ytc_Ugzmpc8MIRDJI62yv214AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzi-6ZOdnkNEhTw4lh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
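The coded result above is a single row pulled out of this batch response. A minimal sketch of how that extraction might work, parsing the raw JSON and indexing by comment id (the id used below is an assumption: it is the fourth entry, whose dimension values match the coded result shown for this comment):

```python
import json

# Raw LLM response for the batch, copied verbatim from the log above.
raw = """[
  {"id":"ytc_UgwmPGoCY107ZuG02rp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugz5sTL3jf5t-hPAX6V4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyYpk1AW0oXrFbEa5d4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"ban","emotion":"disapproval"},
  {"id":"ytc_Ugzmpc8MIRDJI62yv214AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugzi-6ZOdnkNEhTw4lh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]"""

# Parse the batch and index each record by its comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up one comment's coding (presumed id for the comment shown above).
row = by_id["ytc_Ugzmpc8MIRDJI62yv214AaABAg"]
print(row["responsibility"], row["reasoning"], row["policy"], row["emotion"])
# prints: user consequentialist none resignation
```

This matches the Coding Result table: responsibility "user", reasoning "consequentialist", policy "none", emotion "resignation".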