Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is a hit piece without any nuance. Missy Cummings has been very vocal about Tesla's vision-only system… while sitting on the board of a company focused specifically on LIDAR systems. She was removed from her NHTSA advisory role on Tesla cases in 2022 because of her (obvious) conflict of interest.

First, a definition of what Autopilot really is, as there is a lot of confusion between Autopilot and Tesla Full Self-Driving (FSD for short). Autopilot has been available since 2014 and is a level 2 advanced driver-assistance system (ADAS). When you first activate Autopilot, the instructions are very clear: it is meant to be used on highways only, and you have to be able to take over control of the car at any time. You have to keep your hands on the wheel and pay attention to the road. Autopilot comes with every Tesla sold around the world.

FSD was first made available in 2021 under the label "FSD beta", now aptly renamed "FSD supervised". Currently it is also a level 2 ADAS, and the driver is still responsible and must take over if needed. This system is much more capable than Autopilot and isn't limited to highways. We are currently at a point where, most of the time, it can take you to your destination without any interaction besides giving the destination. Once FSD reaches the point where supervision isn't required anymore, the car will be considered a level 5 autonomous vehicle and the driver will be considered a passenger. FSD is a paid upgrade, and "FSD supervised" is currently only available in North America, with potential expansions to China and Europe next year.

Now back to the issue. The video makes it seem as if Tesla cars were death traps when you activate Autopilot, but even in the examples given, some of the accidents occurred outside of highways, places where Autopilot should not have been activated. Numerous inquiries have been opened by the NHTSA after accidents potentially involving Autopilot. None of them concluded there was a dangerous flaw in Tesla's systems; to my knowledge, all investigations were closed without anything substantial to report. Don't forget, unlike any other car, Tesla cars have the equivalent of a black box where everything that happened to the car is stored, not just video. All of this data, while not available to the public, has been made available to the investigators.

Tesla cars have been rated as the safest cars available on the road by NHTSA equivalents all around the world. These ratings include the ADAS, of which Autopilot is a part. Autopilot is also available in Europe, where the rules are much stricter; if there were any doubt that it was sometimes dangerous, it would have been immediately disabled.

In conclusion, even when used according to specifications, Autopilot clearly isn't perfect, but if you pay attention to the road and take over when something unusual happens, it is safer than driving on your own.
youtube AI Harm Incident 2024-12-16T14:0…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugymxs7I-T47d-2EPgl4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxS8QN44BNGQCNwYK14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz5A5m2klmeRivyB5d4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwS_LepYfJbvFnKSMR4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgzebMplL1mA9E_4LPF4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugz8rez9_oL_bu9knnV4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzS8ooHOA5c2aTeTiF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzfLUOfQuTAP5UQmkF4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgwiKaMdsF4mK-Aom514AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugydqn1QFkeeyNgxTd94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"}
]
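The coding result shown above is simply the entry of the raw model output whose `id` matches the comment being inspected. A minimal sketch of that lookup, assuming the response parses as standard JSON (the `code_for` helper is hypothetical, and the raw string is truncated here to three of the ten entries):

```python
import json

# Raw model output as recorded above (truncated to three entries for brevity).
raw = '''[
 {"id":"ytc_Ugymxs7I-T47d-2EPgl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugz8rez9_oL_bu9knnV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugydqn1QFkeeyNgxTd94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]'''

# The four coded dimensions used throughout this page.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def code_for(raw_json: str, comment_id: str) -> dict:
    """Return the coded dimensions for one comment id (raises KeyError if absent)."""
    by_id = {entry["id"]: entry for entry in json.loads(raw_json)}
    entry = by_id[comment_id]
    return {dim: entry[dim] for dim in DIMENSIONS}

print(code_for(raw, "ytc_Ugz8rez9_oL_bu9knnV4AaABAg"))
# {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'outrage'}
```

The printed entry matches the Coding Result table above for this comment.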