Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
https://youtu.be/V2u3dcH2VGM?list=TLPQMTAwMTIwMjVL16cMoKjH9Q&t=163 That's the main problem with humans interacting with digital driving systems as a whole: instead of reacting to visual input from your own eyes (stimulus + reaction), you now have to monitor two things (three actions instead of two). Even if it's meant to help, that's a distraction, because either you trust Autopilot completely, or you have to take the extra step to filter the extra information and/or compare it to the real world (four actions). I don't even take calls while driving, because I want to be engaged with the road. Of course, if it were more like augmented reality, with a HUD on the windshield, it would feel much more natural and user-friendly.

https://youtu.be/V2u3dcH2VGM?list=TLPQMTAwMTIwMjVL16cMoKjH9Q&t=255 In this case, however, it's clear that the machine learning wasn't trained on enough possible contingencies. "Autopilot" only recognizing a stereotypical boxy form with two lights as a car is like playing a deadly PS1 game in beta, far from the futuristic intelligent system Elon makes it out to be. I'm not saying the design is at fault, but clearly the models were insufficient. With all the intellect gathered at these companies, not one programmer thought it was a good idea to plan for the worst of conditions?
youtube AI Harm Incident 2025-01-10T11:1…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          regulate
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwIGm-P-GlgZfLfSGJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugy-ah4dWg82e2XD3AF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwOO2dQ-OxJIJKb7eh4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxCEtyW9TOm2turxDZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugwxie838pcP0MygAph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw8_5Lo5l0a9Yn95Gt4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy7hlyjlE4MHOHdLpF4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxuzHjeZM5ExcnbWRd4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
  {"id":"ytc_Ugy16S_ezLM1VYIlU194AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxlCVcMIH6dO6KwpB54AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"approval"}
]
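The raw response above is a JSON array with one object per comment, keyed by `id`, with the four coding dimensions as string fields. A minimal sketch of how such a response could be parsed and looked up per comment is below; the source does not state which `id` belongs to the comment shown, so the excerpt and the chosen entry are for illustration only.

```python
import json

# Excerpt of the raw LLM response above (two of the ten entries).
raw = '''[
  {"id": "ytc_UgxCEtyW9TOm2turxDZ4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_UgwIGm-P-GlgZfLfSGJ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]'''

codings = json.loads(raw)

# Index the codings by comment id so any comment's dimensions can be looked up.
by_id = {entry["id"]: entry for entry in codings}

# One of the two entries coded distributed/mixed/regulate/mixed, matching
# the Coding Result table above (which entry maps to the shown comment
# is an assumption here).
entry = by_id["ytc_UgxCEtyW9TOm2turxDZ4AaABAg"]
print(entry["responsibility"], entry["policy"])  # distributed regulate
```

Indexing by `id` keeps the lookup O(1) per comment, which matters when a batch response covers many coded comments.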