Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It doesn't seem hard to me that whoever drives the car is liable for the accidents it has. For the sake of argument, IF Tesla WERE actually safer than human drivers and IF Elon HAD NOT lied about its effectiveness, you would still hold Tesla liable. You can have a self driving car that is safer than a human and then car company that maintains the software and keeps updating it for new road conditions, etc, pays for insurance. And you pay a subscription for the self driving (because that requires ongoing work) and along with that, it also covers those ongoing insurance costs so you don't need 3rd party liability insurance. Maybe you could still get 3rd party insurance to cover other things, like accidents that are not the vehicle's fault or other damage or your own driving when the autopilot is not engaged. But Tesla should insure for accidents during the self driving. Now on top of that, you also have just regular products liability because Elon has been lying about the capabilities. And here, you do also have a portion of the liability going to the human who was also negligent. The big figure was the punitive damages because Elon lied. The actual damages from the accident, while large because it was wrongful death, are not as big.
youtube AI Harm Incident 2025-08-15T18:4… ♥ 4
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          liability
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugw3SaUrZX2yOzQ_6Oh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxVHhzoJA5ZGTRNuXB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgysDxWZFwMGJvuStPt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwVm3rQ3D6BcwQpuwx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgyfVw9u8F179VQgM8J4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugxv1KqFmv4nmdmGM5F4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxVNOUcSMXW2HAejnZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgyMPT61al517ReEEip4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxQ4DmOOAHfyLa3tvR4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyK1M9lzunLIii4XUd4AaABAg","responsibility":"company","reasoning":"unclear","policy":"regulate","emotion":"fear"}
]
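A minimal sketch of how a raw response like this might be parsed and sanity-checked. The allowed value sets below are inferred from the sample output above, not an authoritative codebook, and the record used in the usage example is a hypothetical placeholder:

```python
import json

# Allowed values per coding dimension, inferred from the sample
# response above (an assumption, not an official schema).
ALLOWED = {
    "responsibility": {"none", "user", "company", "distributed", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"liability", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "mixed", "indifference"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into coding records, rejecting bad values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: invalid {dim!r} value {rec.get(dim)!r}")
    return records

# Usage with a hypothetical single-record response:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"liability","emotion":"mixed"}]')
codings = parse_codings(raw)
print(codings[0]["responsibility"])  # company
```

Validating against a fixed vocabulary catches the most common failure mode of LLM-based coding: the model emitting a value outside the codebook.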