Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have been working in the car business since 2016 and I told one of my friends that you will NOT see a Full Self-Driving Vehicle (Autonomy Level 5 or higher) in our lifetime. Why? A variety of reasons:

1. Technology, robotics, cars, A.I. all experience "a death" as well, except their death is called: "Obsolescence"
2. You will see Drug and Human Trafficking increase to insanely high levels
3. You will likely have higher insurance premiums, because you're more of a liability if that thing malfunctions.
4. Speaking of insurance, I am sure your auto insurance plan will terminate the moment that extended warranty expires
5. Similar to the "Volkswagen Diesel" Scandal, I wouldn't be surprised if the car was programmed to automatically disengage if an accident is unavoidable AND make sure to automatically delete any data associated with said accident (e.g., erasing dash cam footage).
6. I'm sure somewhere in the "Terms and Conditions" it states that the automaker is not responsible if you get injured in some way, shape, or form.
7. Surveillance Nightmare - All these cars will have all types of interior and exterior cameras that will be recording everything they see and hear (incl. you), and you will not have access to said footage, and the company will keep the footage for a minimum of 15 years
8. All that self-driving tech will inflate the price of the vehicle to insanely high levels
9. At Level 3 Autonomy, the car manufacturer will assume responsibility... which is why a lot of car brands are hovering over Level 2 autonomy; some even Level 2.5 autonomy
YouTube AI Harm Incident 2026-04-25T04:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          resignation
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugz_Do03kXRNZXy2Q2J4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxrq0mfbmFLf5JaKy54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzU1ugdrbrMCOErFS14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgwLGKaB5RTNWgBoDpJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzQszlgreWqJyNnsv94AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwL3dPqgV74b4P0Dj54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugwfz9sshphchim_KfR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycfADOLtTxQ4wkneF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugymq90hkqyuD0zCKNt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzQOjyaLnvj-ItC1QJ4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
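The raw response is a JSON array of per-comment code objects, so checking the coding for any one comment amounts to parsing the array and indexing it by comment id. A minimal sketch of that lookup (the helper `codes_by_id` is hypothetical, not part of the tool; the two entries are copied from the response above):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codes,
# with the field names used in the response above.
raw_response = '''[
  {"id": "ytc_Ugz_Do03kXRNZXy2Q2J4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgycfADOLtTxQ4wkneF4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "fear"}
]'''

def codes_by_id(response_text):
    """Parse the response and index the code objects by comment id."""
    return {item["id"]: item for item in json.loads(response_text)}

codes = codes_by_id(raw_response)
# Inspect the exact coding the model produced for one comment:
print(codes["ytc_UgycfADOLtTxQ4wkneF4AaABAg"]["responsibility"])  # -> company
```

If the model wraps the array in extra prose or the JSON is malformed, `json.loads` raises `json.JSONDecodeError`, which is a simple way to detect responses that need re-prompting.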