Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I wrote the AI for ATM machines most banks use to optimize their internal and external logistics. When should they be replenished, and by how much and by what method. So I know something of the subject. When it comes to "self-driving" cars, my prime concern is the "objective function" with respect to its existential philosophy. Will it sacrifice itself to save a cat or dog? A child? Even at the probable expense of the lives of the buyer and passengers? If I have to drive off a bridge and die to avoid hitting someone, then the person is going to be hit if I'm driving. An AI could easily make a different choice based upon the "decision weights" it has been initially seeded with. I will never buy a truly self-driving car (IMO they are at least 20 years out unless culturally we are willing to "learn" just like humans did at the dawn of the industry. People forget how awful car accidents used to be.
youtube AI Harm Incident 2022-09-03T21:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        contractualist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugw-ZdR--2Vx0krS-yp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx6CbqyCoJV9VZAHqp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx_VF7ouu_c4BQCcjB4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugxr3KETI0jxrkeuK2t4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzMuZxDfaC23kZiXY54AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyXKsnW-5E_prR2eAJ4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwA4Urx2Kap4bd-lox4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugx1oQpkJoVjhejiktl4AaABAg", "responsibility": "developer", "reasoning": "contractualist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzlSFViqLTok2aEOQB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugy3STGn4vRbXyqHx7Z4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
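To inspect a raw response like the one above programmatically, the JSON array can be parsed into a lookup keyed by comment id and checked against the coding scheme. The following is a minimal sketch, not part of the coding pipeline itself; the allowed category sets are an assumption inferred only from the labels visible on this page, and the real scheme may include additional values.

```python
import json

# Assumed category sets, inferred from the labels visible in this page's
# output; the actual coding scheme may define more values per dimension.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "mixed"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation", "indifference", "mixed"},
}

def parse_codings(raw: str) -> dict:
    """Parse a raw LLM response (JSON array of coding rows) into a dict
    keyed by comment id, rejecting rows with out-of-scheme values."""
    out = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        out[cid] = {dim: row[dim] for dim in ALLOWED}
    return out

# Hypothetical one-row response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"contractualist","policy":"unclear","emotion":"mixed"}]')
coded = parse_codings(raw)
print(coded["ytc_example"]["reasoning"])  # contractualist
```

Keying by id makes it easy to jump from a row in the results table back to the exact model output that produced it.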