Raw LLM Responses

Inspect the exact model output associated with each coded comment.

Comment
Great points but it misses an enormous point: the programmers designing the software will be no better than the ones that we have today which means it will be sloppy, inadequately tested, and loaded with bugs. Chances are this scenerio would not happen because the happy couple would have entered the vehicle only to find that the car updated itself over wifi and now the engine won't start. Remember the 90s when everyone said that it was ok for Windows to be as buggy as it was because Microsoft didn't control the hardware (and then years later they DID control the hardware behind the Xbox 360 which was one of the most unreliable pieces of tech ever released)? Well, unless Google and Apple become the world's only automakers it means that software companies will be designing software for Dodge, Chevy, Ford, Lexus, Toyota, Hundai, Honda, Kia, etc. This means we're going to have a world where the autonomous car software companies will not be in charge of the hardware. The industry has already said this situation makes it acceptable for their products to be loaded with bugs! I'm sorry but the talent just is not there in this industry to expect these cars to be safe. Case in point: autocorrect. Autocorrect tends to make more errors than it fixes, but these same developers are to be trusted with the far more complicated task of safe driving? We've become numb to restarting software and fixing the problems caused by bad code but you can't redo your deadly ride to work.
youtube AI Harm Incident 2017-03-12T07:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UggPlXqhTyqn-HgCoAEC","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgijP7n1AYDAFHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjSelYS_yNxMXgCoAEC","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghtfnAXloUXangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ughv4M1zM_ZhFHgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UggWc282B73l5ngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugir1uoAgHGQ63gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghiAb5OOQ50H3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UghKAohdhKOGKHgCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgjB5UYNyemZAngCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"resignation"}
]