Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Autopilot totally contributes to accidents in two ways - first by being blind to hazards (such as above), plowing straight into things and second by allowing drivers to become unattentive to the road and conditions. It is human nature that if a car has an "autopilot" or "full self drive" they become distracted, they play on their phones, they look around, they stop looking ahead because the car is driving them right? Wrong. And Tesla has a minimal lame method of requiring the driver to demonstrate attention - hold the wheel. Not only does it not ensure attention but it is is easy to subvert and completely broken in some online clips (i.e. people take their hands off). So yes this technology kills people. But Tesla can't sell cars without a kewl, broken feature with a misleading name. A proper system would monitor a person's face for attentiveness and disengage if it wasn't there. But not Tesla. And Tesla also disengages the autopilot a split second before the crash to pretend it wasn't on at the time, to mislead the public when crashes happen. And it will only get worse when their "robotaxi" appears. Passengers will get to enjoy plowing into hazards without any chance of stopping it because there will be no wheel or brakes. I'm sure they'll still find ways to blame passengers or they'll shove the liability onto operators foolish enough to buy one of these taxis under the delusion they'll make money.
youtube AI Harm Incident 2025-05-13T20:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
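
For illustration only, the four coding dimensions can be sketched as a small Python schema. The value sets below are inferred solely from the values visible on this page; the actual codebook may define additional categories, and the names here are hypothetical.

# Hypothetical sketch of the coding dimensions, inferred only from the
# values shown on this page; the real codebook may allow more categories.
CODING_SCHEMA = {
    "responsibility": {"ai_itself", "company", "user", "distributed"},
    "reasoning": {"consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"fear", "outrage", "indifference", "approval"},
}

def is_valid_coding(record: dict) -> bool:
    # True only if every dimension is present and uses a value from the sketch above.
    return all(record.get(dim) in allowed for dim, allowed in CODING_SCHEMA.items())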
Raw LLM Response
[ {"id":"ytc_Ugx0L1qHglX3pK9gPYd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugyz4uJIB8QA94SJXJh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugy7UTlST1n9Yn0_Dnl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_UgwUgmr3H29k7l5eOFB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_UgxsMR8mTA-DjIx5ilJ4AaABAg","responsibility":"user","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugyb7iz4khmOa1kskUZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgxIUrI3kcZaz93ppHp4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugw8cUF7UZR4JVo9ry94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgyjhfHKMc210FQq_yl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}, {"id":"ytc_Ugzil9Bn_RkffBN3O-d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"} ]