Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
“Hey folks, as if driver vigilance wasn’t enough of an issue normally, let’s tell folks they can kinda, sorta, sometimes, hopefully most times, but suddenly no times take their full attention off piloting the vehicle at 70 mph.” I guess “Autopilot Sometimes” didn’t score as high with the focus groups. There’s no doubt Tesla is responsible for an amazing number of automotive advancements. But the idea of putting a semi autonomous system into a car is the worst human factors trap you can imagine. People want to believe in the tech– who wouldn’t. But the idea that you still need to be vigilant, runs completely against what people imagine automation is all about. As a result, people are lulled into a false sense of security. Specially something called Autopilot, a term that was specifically adopted to bring to mind the complexity of an airliner navigating itself thousands of miles and many hours without active pilot inputs. Couple that with the carnival clown at the top of that company, spouting misinformation about the technology plus cutting corners on situational awareness systems and we’ve got the perfect storm that will and has killed people. Oh and don’t get me started on the Drive By IPad frenzy that has taken the EV world by storm. I’d love to see data on driver distraction events in cars like this with no tactile, eyes-off controls.
Source: youtube · AI Harm Incident · 2024-12-19T15:2…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwRzV2LX1QZzc-wZbd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwuc9vf3Jykh_Eh5vJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy60AjfVIrwzWH03L54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQpLY1Bn8ksDR2f1N4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugx-mZ4GdH4_e5--bQx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugy_Md68ksjRgGYz5EN4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwGRNdPmLJN5KpYXLh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxxtA32pQjGnY2uIix4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgxelEBUXWd0_tb9QLx4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxfx527WSxcGA8GYpV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
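A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is only a minimal sketch assuming the model always returns a JSON array of objects with the five keys shown here (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the function name `parse_codings` and the validation rule are illustrative, not part of the tool's actual pipeline.

```python
import json

# The five keys each coded row is assumed to carry (per the raw response above).
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response into a mapping of comment id -> coding dict."""
    rows = json.loads(raw)
    codings = {}
    for row in rows:
        missing = REQUIRED_KEYS - row.keys()
        if missing:
            # Fail loudly on malformed rows rather than silently dropping them.
            raise ValueError(f"row {row.get('id')!r} missing keys: {missing}")
        codings[row["id"]] = {k: row[k] for k in REQUIRED_KEYS if k != "id"}
    return codings

# Usage with one row from the response above:
raw = ('[{"id":"ytc_Ugxfx527WSxcGA8GYpV4AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
codings = parse_codings(raw)
print(codings["ytc_Ugxfx527WSxcGA8GYpV4AaABAg"]["emotion"])  # outrage
```

Keying by comment id makes it easy to join the model's coding back onto the original comment (as the Coding Result table above does for the last row of the response).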