Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@matthew_berman I disagree because the AI that drives a car is still making life or death decisions. Whereas ChatGPT isn't controlling your life. You can decline to read it at any moment. You don't have to ask it a question. But if you are in a car, that AI literally controls whether or not you live. And just to make it slightly more explanatory... how many times when an AI crashed a car...did anyone every ask "AI, are you ok?"... No one did. because they don't consider the AI as alive. But, we worry about the passengers right? So, I strongly disagree that Car driving AI is not a concern. It is a much bigger concern than ChatGPT.... which text only... and has zero control of my driving experience.
youtube AI Governance 2023-03-30T20:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsY-cBEF5A", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwvtUjccGFfPIV6nwZ4AaABAg.9nrnZpaNGkR9nsbW4K7MBw", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrn4m0trjv9nruHw0PaSq", "responsibility": "company", "reasoning": "virtue", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nrle7iFSFV", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ns6RZtoKDO", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9nt0NShWS_s", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntJ7Qv2sCu", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw6EtfFGqbU3EKNXFx4AaABAg.8ebBLFhnP-u9TQaU28JdPc", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgxyULC5OslX0G74cJx4AaABAg.8eZkIXf7xt38e_xmX9IADA", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
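To trace a coded comment back to the exact model output, the raw response can be parsed as a JSON array and indexed by comment id. The sketch below illustrates this with two entries copied from the response above; the helper name `coding_for` is illustrative and not part of any pipeline API.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw6EtfFGqbU3EKNXFx4AaABAg.8ebBLFhnP-u9TQaU28JdPc",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]"""

def coding_for(raw: str, comment_id: str) -> dict:
    """Return the coding dict for one comment id (KeyError if absent)."""
    by_id = {entry["id"]: entry for entry in json.loads(raw)}
    return by_id[comment_id]

coding = coding_for(raw_response,
                    "ytr_UgzcE7kEUWSfQQYndD94AaABAg.9nrlIfZL3aU9ntlXtisU_-")
print(coding["policy"], coding["emotion"])  # liability fear
```

Matching the id this way confirms that the table's values (responsibility `ai_itself`, reasoning `deontological`, policy `liability`, emotion `fear`) come from the eighth entry of the raw response.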