Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There shouldn't be, in any way shape or form, the ability for a semi-autonomous driving assist to ever break traffic laws. But the next step is not having the general public have access to and ability to avoid the safety precautions of these features so nonchalantly. If you allow the average morally ignorant individual to use these features they (the cars) better take every-single-uncertainty as a chance to inconvenience the user and not fatally contribute to a passerby's life. If you "auto-pilot" and the computer detects anything uncertain at all, and the driver doesn't respond, it should flash its hazards and slows down safely or something. Also having auto-driving should be its own endorsement or license. If you don't take a mandatory class with the same severity of say a CCW or CPL, you shouldn't get to use the features of any kind that allow you to slack. It's infuriating that lives are lost because automakers decided to push to market something so volatile before legislation could respond AND name it deceivingly to sell better. It's one thing when it's phone software or household luxury goods. It's another when it's automobiles, food or anything kid related, etc. The average person can not and should not be trusted with this. How many vehicles get sent to the mechanics for ignorance and negligence are a testament to how the average person perceives the severity of using a vehicle. It is the easiest way to achieve so much risk, driving. Auto-driving, that isn't damn near perfect, is a step above with the human factor. God, I'm mad... Make the screen say "you can kill someone, pay attention" outloud, every 5 minutes or something.. at least attempt to fix it before being forced to!
youtube AI Harm Incident 2022-09-04T08:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgwihHtRtipqExayxmp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxeMiGEGA4wlbEQZFR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzyFEqzRE-YZA1JCGJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz5uzt3tMk4_ujEk6l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwnQSDmqVprrxQ_d214AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugze2053bj_OrYlRSgF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwaByERyBw9mKNpFQZ4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgznQHRyv0bFZ2H3DzF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIG76mvw5gt-huRoN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzPGpyVPpLylD9gjN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
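When inspecting raw responses like the one above, it can help to mechanically check each record against the coding dimensions before it is accepted. The sketch below is a hypothetical validator, not part of the original pipeline; the label sets are only those values observed in this particular response, so the full codebook may allow additional values.

```python
import json

# Label sets observed in this raw response (assumption: the real codebook
# may define more values per dimension).
OBSERVED_LABELS = {
    "responsibility": {"none", "company", "developer", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose labels
    fall inside the observed label sets; report the rest."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        bad = [dim for dim, allowed in OBSERVED_LABELS.items()
               if rec.get(dim) not in allowed]
        if bad:
            print(f"{rec.get('id', '?')}: unexpected value(s) in {bad}")
        else:
            valid.append(rec)
    return valid

# Example with one record copied from the response above.
raw = ('[{"id":"ytc_UgwnQSDmqVprrxQ_d214AaABAg","responsibility":"company",'
       '"reasoning":"deontological","policy":"regulate","emotion":"outrage"}]')
print(len(validate_response(raw)))  # → 1
```

A check like this catches the common failure mode where the model invents a label outside the codebook, which would otherwise pass silently into the coded dataset.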