Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
TL;DR: Either that Tesla (if it's complete/true self driving auto pilot), or the human driver (and evidence suggests that Tesla AutoPilot is more marketing than reality) should've either fully stopped (also turned on all 4 blinkers/hazards lights), or switched lanes ... Either/both well ahead of that sideways laying truck.

This comes from an electrical and software engineer (electronics and programming), actually born/grew up in areas where Nikola Tesla came from. Who's grandad was real-world "Transporter" (except no Jason Statham type of fighting - grandpa was originally "just" a car mechanic, which was a prerequisite to be a professional driver, and later he wore a suit and drove highest government officials). And I also happen to drive a lot - besides regular everyday driving also 4-6 times per year I drive the majority of ~1800km drive across the Europe.

While I love distance/speed/lane maintaining cruise control and all that - I'm not going to rely on "auto pilot self driving" cars until technically much simpler things - e.g. all the planes driving/taxi on the tarmac, all the trains and metros (this is already the case in some places), and then all the light rail/trams and finally buses (those have tracks or dedicated lanes - but there might be a person/car/etc in their way) are fully autonomous.

I'm confident that by then those fully autonomous vehicles will not use just cameras/AI-Vision for detection. Because even when you can control all the variables (no other non computer controlled vehicles on the road/track) - there are always birds, animals, snow, fallen leaves, construction ...etc. Because even "dummy" VW Touran that can only maintain speed and distance from the car in front (doesn't have lane keeping) - with basically no AI - can reliably detect and emergency break/stop before hitting things that Tesla's "Auto Pilot" seems to keep crashing into.
youtube AI Harm Incident 2024-12-14T18:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           liability
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwO1DwBUKrXUTVCJLB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyIxKaWNvuje9mLEnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw38bHPL9gHZuyZFRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugzlcdppz8gsJEhC4T94AaABAg","responsibility":"company","reasoning":"mixed","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_UgwgNe4trca77ldco1p4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgxtfYY_ZuT0_3OAidd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw0TXVAD76SmnkTqxp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwpC-0bLmiosdzj76p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugx-6dulXef4tjTkbxd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyaBb03d1K8bS2xOjh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
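The raw response above is a plain JSON array, one object per coded comment, so it can be consumed with the standard library alone. A minimal sketch of parsing it and tallying one dimension (using an excerpt of the first three objects from the response; variable names are illustrative, not part of the tool):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response above (first three coded comments).
raw = '''[
  {"id":"ytc_UgwO1DwBUKrXUTVCJLB4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyIxKaWNvuje9mLEnR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw38bHPL9gHZuyZFRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]'''

codes = json.loads(raw)

# Tally how often each responsibility label was assigned.
by_responsibility = Counter(c["responsibility"] for c in codes)
print(by_responsibility)  # Counter({'user': 1, 'company': 1, 'distributed': 1})
```

The same pattern works for the `reasoning`, `policy`, and `emotion` dimensions, since every object in the array carries all four keys plus the `id` linking it back to the source comment.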