Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If you don't watch where you're driving, you'll cause an accident. What are you doing playing with your phone while driving, anyway? If someone deliberately neglects their responsibility as a driver, it is not fair to shift the blame onto the system. You should not expect Enhanced Autopilot to always stop your vehicle in the event of potential danger. It is a supportive tool, not a safety guarantee. You remain the driver and are ultimately responsible. Whether it's a Tesla, Volvo, BMW, or Ford—as long as the system is Level 2 (such as Enhanced Autopilot), you shouldn't expect it to autonomously recognize danger and always stop. You remain responsible for monitoring and intervening. Because Tesla's manual explicitly states that the driver must remain alert and intervene, this supports the argument that a driver who is distracted (e.g., looking at their phone) bears primary responsibility. At the same time, the manual also shows that Tesla recognizes that the system has limitations — meaning that the system does not automatically cover all risks. It is illogical and journalistically inaccurate to use an accident involving Enhanced Autopilot to label Full Self-Driving as dangerous, and it is unfair to single out Tesla as “the dangerous one” when other brands use similar systems with the same limitations.
youtube AI Harm Incident 2025-11-01T10:1… ♥ 2
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugwu4rHzVpWCrLX-PDx4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugx7O0cKCI7CBztphUh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBIz3LLF1v3ot_cx54AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxLtdt4jgYA00Lpfr14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx7OWanwO7ttJ-l1wJ4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxKT6NMx2d-xvzoKiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxKaEF8mBH_TNOXPFJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyzyXIx-HILl2sh6ip4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwvDSlxL2FHVLLctiJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwmFboZijQ43Q3MEeJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]