Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
BOOOO. This entire question is moot. Know why? "Your car doesn't have time to stop" is a stupid condition. Self-driving cars will be programmed to drive at a safe distance from any car. *If something can fall off the car in front of you and you can't slow down to avoid it, you are following too close.* *PERIOD.* Also, if you're the car behind the driverless car, if they slam on their brakes and you hit them, you are following too close/aren't paying attention too. You should both be able to slam on your brakes and slow down together (with you a second or so behind) and you should not hit them. That's the rules of TODAY'S roads, let alone with driverless cars! Thankfully, these kinds of armchair thinking distracted drivers won't be an issue when most cars are driverless. When that happens, an entire lane of cars can slam on their brakes and no one will have to hit each other, because they'll all be driving safely--by the computer.
youtube AI Harm Incident 2017-06-15T17:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        consequentialist
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UggttszQdOIT0XgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugjeinq77JWQDngCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_Ugi9KaGK7Pz36HgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgjT75c_hYfYXngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UghemGmrqvMpTHgCoAEC", "responsibility": "company",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "approval"},
  {"id": "ytc_Ugg9VdcEV0AJu3gCoAEC", "responsibility": "none",      "reasoning": "unclear",          "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_UgiMecIFIV9nLHgCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugio-4_UVi4xP3gCoAEC", "responsibility": "user",      "reasoning": "deontological",    "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugh2ii7t431fIXgCoAEC", "responsibility": "user",      "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UggBZZ06kKai4ngCoAEC", "responsibility": "none",      "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"}
]
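A raw response like the one above is a JSON array of per-comment records, one record per comment ID, with one field per coding dimension. A minimal Python sketch of how such a response might be parsed and indexed by ID (the two-record sample string and the `parse_coding` helper are illustrative assumptions, not part of the actual pipeline; the field names follow the records above):

```python
import json

# A shortened two-record sample standing in for a full raw LLM response
# (the real response lists one record per comment in the batch).
raw_response = """
[
  {"id": "ytc_Ugh2ii7t431fIXgCoAEC", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UggttszQdOIT0XgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
"""

# The coding dimensions every record is expected to carry.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_coding(raw):
    """Parse a raw response and index its records by comment ID,
    dropping any record that is missing an expected dimension."""
    records = json.loads(raw)
    return {
        rec["id"]: rec
        for rec in records
        if EXPECTED_KEYS.issubset(rec)  # dict iteration yields its keys
    }

coded = parse_coding(raw_response)
print(coded["ytc_Ugh2ii7t431fIXgCoAEC"]["emotion"])  # outrage
```

Indexing by ID makes it easy to cross-check a single comment's coded dimensions (as in the Coding Result table above) against the raw batch output.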