Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You make many good points in this video that i absolutely agree with. However some argument are a bit too black and white. I am in the position of owning both a tesla model 3 2019 and a MT-07 2016 and have been following tesla for many years. I have to give a short version here as to tor spend an hour or more (and english is not my first language)

The Lidar argument is referencing the argument that lidar is not scale-able and expensive, therefore when both lidar and camera based AI have matured, the results will be about equal in theory. (paraphrasing here)

The disengaging of autopilot one second before impact is both dumb and disgusting! Calling the not self-driving mode "autopilot" is worthy of criticism indeed, as my impression of the average motorist / car driver is not favorable. Having a name to this mode that makes people think "i can relax because my car drives itself" is a very bad idea.

On the point of not having two sources (radar and camera) i get your point, but i do disagree. I would defend this decision more strongly from tesla IF THAY HAD a name that did not imply that the car was driving itself. But at the same time, you are reminded on the website before buying and on multiple occasions after, that you are the driver and have full responsibility of the car's operation and handling. No accidents would happen if drivers would pay attention all the time and drive responsibly... Its unfortunate that all humans make errors.

In summary i would like to see changes to the name of the autopilot, stronger wording and warnings about "autopilot" and the limited capabilities it has (and remind drivers that they are driving the car) and people to put down their f**king phones and pay attention. Still a good video with a lot of good criticism!
youtube AI Harm Incident 2022-09-03T20:0…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         unclear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[{"id":"ytc_UgyP2gT01fvnk7m4NBp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugy09Pax8SAt5Kcc3Ut4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugz8JWfitD7jCXoigh94AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_UgzixXhx7_umaWo02Tt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgzJQIYNEiLDApukNch4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_UgzHCiHGDsFOTk8f91B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
 {"id":"ytc_Ugz36zUpe7cdSyjL7xt4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"fear"},
 {"id":"ytc_UgzFi1J5aGlKEd8vYP14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgzxOP-UykYOEhuJrL54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
 {"id":"ytc_Ugy1MWLhqkzpjwUMaTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"})
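Note that this raw response terminates with `)` rather than `]`, so it is not valid JSON, which would plausibly explain why the coding result above falls back to "unclear" on every dimension. A minimal sketch of how a post-processor might parse such a response and degrade gracefully is below; the function name `code_for`, the `ALLOWED` value sets (collected only from the labels visible on this page, not the full codebook), and the fallback behavior are all illustrative assumptions, not the pipeline's actual implementation.

```python
import json

# Allowed values per dimension, as observed in this page's output
# (assumption: the real codebook may define more values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "liability", "regulate", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "unclear"},
}

def code_for(raw: str, comment_id: str) -> dict:
    """Return the coding for one comment id from a raw LLM response.

    Falls back to 'unclear' on every dimension when the response is
    not valid JSON (e.g. a trailing ')' where ']' is expected) or the
    id is absent from the parsed array.
    """
    fallback = {dim: "unclear" for dim in ALLOWED}
    try:
        items = json.loads(raw)
    except json.JSONDecodeError:
        return fallback
    for item in items:
        if item.get("id") == comment_id:
            # Keep only recognized values; anything else becomes 'unclear'.
            return {
                dim: item[dim] if item.get(dim) in ALLOWED[dim] else "unclear"
                for dim in ALLOWED
            }
    return fallback
```

For example, feeding this function the truncated/invalid variant of a response yields the all-"unclear" record, matching the table shown above.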