Raw LLM Responses

Inspect the exact model output behind any coded comment.

Comment
Oh wow!! I just watched this and realized a couple of things... First, the title says "Footage SUGGESTS reasons" but doesn't say that WAS the reason... In fact, they even say the driver was alerted PRIOR to the accident and that the "although the system WORKED AS DESIGNED" it was "Not enough to sideline THE IMPAIRED DRIVER"... So, the driver was IMPAIRED, yet they didn't really make that a POINT other than to 'casually' mention that the system wasn't ENOUGH to sideline the driver... Which, means, the DRIVER hit the cars, not the autopilot system... Too many times you will see a car (any type) that hit that ONE lonely tree or telephone pole on the side of the road 'head on'. And you have to wonder "Didn't they have enough time to swerve out of the way?" - well, yeah, they probably did, but SOME people can get 'fixated' on the object(s) and head straight for them instead! This happens A LOT with emergency vehicles as well... The driver fixates on the LIGHTS and automatically heads right for them rather than around them... Too many times drivers (who survived) have said "The last thing I saw was the telephone pole" or "Last thing I remember was seeing a big tree in front of me"... This of course ends up in an accident that eventually is called 'distracted driving' or such... But in reality it's actually FOCUSED driving - but focusing on the OBJECT instead of the road and their actual driving... It's also why so many people will 'cross the lines' into another lane when they look at something on the side of the road... If it's on the right side, they drift right... On the left side and they drift left... While, had they just let go of the wheel (or REALLY paid attention to the wheel like they should), they would have gone straight and not drifted to one side or the other... It's a NATURAL phenomenon... Some people even 'lean' to one side as well... So, *I* suspect that the driver may not have even NOTICED the 'warning' on his screen and, instead just headed his car right into the 'shiny lights' OR that he saw the warning, grabbed the wheel 'slightly' like he was used to doing when getting the 'nag' warnings and then it 'turned off' the autopilot and let him ram into the pretty lights...
Platform: youtube · Incident: AI Harm Incident · Posted: 2023-08-09T18:2… · Likes: 5
Coding Result
Dimension        Value
Responsibility   user
Reasoning        deontological
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
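For reference, the four coding dimensions can be modeled as Python types. This is a minimal sketch: the Literal values are only the labels that appear in the raw response below, and the actual codebook may define additional labels.

from typing import Literal, TypedDict

# Labels observed in this raw LLM response only; the full codebook
# may allow more values (assumption).
Responsibility = Literal["none", "user", "developer", "company"]
Reasoning = Literal["unclear", "deontological", "consequentialist", "virtue"]
Policy = Literal["unclear", "none", "ban", "liability", "regulate"]
Emotion = Literal["indifference", "outrage", "mixed", "resignation", "fear"]

class CodedComment(TypedDict):
    id: str
    responsibility: Responsibility
    reasoning: Reasoning
    policy: Policy
    emotion: Emotion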
Raw LLM Response
[ {"id":"ytc_Ugw00L5l_lloQtZc43B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_UgwzFrHyxzhYTM6S8yp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}, {"id":"ytc_Ugws1jZI893bMhG_Kbl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugx05WZaYxHDPze36m94AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"resignation"}, {"id":"ytc_Ugxn_zfQQ7J57Hd_m8J4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugx3kaV7yWQiDojdtjZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}, {"id":"ytc_Ugxd6oL9cXZr4qz54p94AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgzsAxbDsGhHeDAng5t4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxBnWcUisJu4A8N1rV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgwQOszPHHmhWhMAzdV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"} ]