Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Actually, all cameras that are in view see it, it's just that the software needs to translate what all of them are seeing, depending on the angle as well. Basically, it's the software, not the camera that's not seeing it, since it's not trained on overturned vehicles, then it doesn't know how to process the image of what the cameras are feeding it.

I've worked with 16 camera systems with LiDAR and RADAR that are pointing in the correct direction, with a bus annotated in front of it passing by, but somehow the software thinks it's flying 50 feet off the ground, so of course it's going to try to ram it, since it thinks it's not there. Heck, it even doesn't know what to do with a wall, since it's not really trained to analyze it either. If your self-driving car somehow goes over the sidewalk and you're not on the wheel, you're screwed. It's just going to ram itself.

This is not limited to Teslas, this holds true for all self-driving vehicles at the moment. Stop making it sound like the cameras are the only ones to blame, they are just the first and last line of defense. LiDAR can only assist from a distance to some extent, but anything that gets too close also ends up as garbled data. If say you're on a bicycle and on the side of a LiDAR equipped vehicle, 2 feet away from it, you're in all likelihood a garbled mess for the computer to analyze, and you can still get sideswiped.

This technology is not ready for mainstream. Not at all. I doubt it will even be ready in the 30s, but hey we need to sell the masses some new hype or buzzword to throw their money at. Who cares if they die as long as we make more money, right?

Also, pilots are properly trained to fly a plane and use autopilot when needed, regular drivers just need to pass the driving test to get that license. Regular people will just buy into the marketing BS of autopilot that's fed to them because most people aren't that bright, let's be honest here.
A smart and informed person will probably not trust any technology 100%, that's why we have backups, contingencies and redundancies, cyclical error correction, etc. everywhere, but most people are not very smart or well informed, unfortunately. That's just how it is.
youtube AI Harm Incident 2024-12-14T02:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugxdmhhe2xJy38HHznJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyFZdXyfXOsO7ZwS2N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzWKi7yOLCXXhIjoYl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugzj1gf7aLSQPCHrLE14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwRC2hQ9Ro2054MSoR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyPLsMmoebbolnGKoF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzRAyEwWX46oCbrWNR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugye6sm5CimNgUuqd6N4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgySqzulxczhECGzJZB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwR6MlY7-XSeYc0CHt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
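A raw response like the one above can be checked mechanically before the codes are accepted. The sketch below is a minimal Python validator, assuming the codebook contains exactly the category labels visible in this record (the real codebook may define additional values, and the `CODEBOOK` dict here is an illustration, not the project's actual schema):

```python
import json

# Assumed codebook, reconstructed from the values seen in this response;
# the project's real codebook may allow more categories per dimension.
CODEBOOK = {
    "responsibility": {"none", "company", "user", "developer", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "ban", "liability", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate_codings(raw):
    """Parse a raw LLM response (JSON array of coded comments) and
    verify every row has a ytc_-prefixed id and only allowed codes."""
    rows = json.loads(raw)
    for row in rows:
        if not str(row.get("id", "")).startswith("ytc_"):
            raise ValueError("unexpected comment id: %r" % row.get("id"))
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError("%s: bad %s code %r" % (row["id"], dim, row.get(dim)))
    return rows
```

Running the validator over the raw response returns the parsed rows on success and raises on any id or code outside the assumed codebook, which makes malformed or hallucinated categories fail loudly instead of silently entering the coded dataset.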