Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I dont think its fair to blame Elon, nor Tesla for this. It was an AI problem, and that was inevitable in the long run of this. I do agree it was idiotic to remove radar, which could have accurately measured the distance, thus prevented those deaths. But there is another factor to this you never mentioned once.. Why was the HUMAN behind the wheels not braking? Because it was on autopilot, and he slept or watched youtube? So its okay to not blame the human driver for NOT seeing the motorcycle either? Doesnt that logically mean that this same driver would have hit the motorcycle if he was NOT driving a tesla as well, since clearly the driver isnt paying attention... The way Tesla autopilot works is that if the driver gives ANY input, it overrides the autopilot.. This means the driver also didnt see the motorcycle, and did NOT step on the brakes.... That is my point. Even if that driver wasnt driving a Tesla, the collision would still have happened. And if you dare to argue that "he should be able to trust the autopilot, he should be able to take a nap while on the highway", i dont know how to even have a sensible conversation with you...
youtube AI Harm Incident 2022-09-05T02:2… ♥ 1
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | distributed
Reasoning      | consequentialist
Policy         | none
Emotion        | resignation
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugz4VRuC36aj7sm4w0B4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw_nz5TwQRZLCKJe2B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyyTpYmWwrWh-Kf44R4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwHnexOjbvvU-CPeTx4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwKS7jX8Rx6ZQ10j_F4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgzXkc48AgxXLaWeqRB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugy1eSvFq_Wu3oGKhPZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwA-cuAlrYFrmRrmFt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyca6korHlUMd8ahtF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzl4f42nbYNUtEF8ex4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"}
]
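The raw response is a JSON array with one coding object per comment, keyed by comment `id`. A minimal sketch of how such a batch can be parsed and a single comment's coding looked up — the variable names are illustrative, not part of any coding pipeline, and the excerpt below assumes the model returned valid JSON:

```python
import json

# Excerpt of a raw LLM batch response (first two records from above).
raw_response = """[
  {"id": "ytc_Ugz4VRuC36aj7sm4w0B4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw_nz5TwQRZLCKJe2B4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "outrage"}
]"""

# Parse the array and index the codings by comment id, so the coding
# for any one comment can be retrieved directly.
codings = {row["id"]: row for row in json.loads(raw_response)}

record = codings["ytc_Ugz4VRuC36aj7sm4w0B4AaABAg"]
print(record["responsibility"])  # -> distributed
print(record["emotion"])         # -> resignation
```

In practice the parse step would be wrapped in error handling (`json.JSONDecodeError`), since model output is not guaranteed to be well-formed JSON.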