Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Not sure when this occured, but I'm willing to bet that it happened before the software stack 'merge'. Before, there was highway autopilot, and city autopilot. City Autopilot (FSD) is designed to recognize stopped objects anywhere in it's vision, and avoid. The old highway autopilot software stack was not. It used older algorithms to focus on moving vehicles around and the land in front. Hence, the difficulty in recognizing vehicles halfway into a lane. It expects people to be paying attention, like they are supposed to. Since then, Tesla has merged the two software stacks into one main FSD stack, making the vehicles MUCH better and seeing, recognizing, and avoiding abnormal situations on the interstate. It also recognizes flashing lights and immediately slows the vehicle, displays a warning on the screen and won't continue at prior speed until the driver instructs it to by pressing on the accelerator. I appreciate that the NHTSA is investigating semi-autonomous crashes like this, but realistically, they move way too slow. By the time they release their findings and recommendations, Tesla would have long ago recognized the underlying issues, rewrote the software, and uploaded it to all Tesla's on the road. The most important takeaway from all this is that, especially right now, if you are using any semi-autonomous driving car, you need to pay attention. It's not perfect...yet. While we keep finding edge cases where software needs to be tweaked, the cars will get better and better at driving. And personally, even right now, I would rather be on a highway full of Teslas in autonomous mode than be around the vast majority of drivers on the road.
Source: YouTube · Incident: AI Harm Incident · Posted: 2023-08-10T00:0… · Likes: 8
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugw00L5l_lloQtZc43B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwzFrHyxzhYTM6S8yp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugws1jZI893bMhG_Kbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugx05WZaYxHDPze36m94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}, {"id":"ytc_Ugxn_zfQQ7J57Hd_m8J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx3kaV7yWQiDojdtjZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxd6oL9cXZr4qz54p94AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgzsAxbDsGhHeDAng5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxBnWcUisJu4A8N1rV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwQOszPHHmhWhMAzdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"} ]