Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I just think of it this way: If someone jumps out in front of me in that situation, I'm not going to write an ethics essay on what I should do in that situation, I am going to freak the fuck out, slam on my breaks, and probably swerve. The car is the only one with the luxury to make a moral decision, because it's the only one with the capacity to observe all of the variables. However you slice it, automated drivers are better than human drivers.
youtube AI Harm Incident 2014-05-25T23:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UghX3ONnXBqe8XgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggBb4ZNQvsUxngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgiB44HQOpV9QXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggyaM-IUg_XuXgCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgiIlLAil_Zq03gCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgjU67dU5p4ZfXgCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UghsmoHmvmSdbXgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ughie0wJ93N44HgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugj0eYYZq3WVp3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_Ugg5q94oNYv7l3gCoAEC", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
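A minimal sketch of how the raw response above can be turned into per-comment coding results. It assumes only what the page shows: the model returns a JSON array of objects, each carrying an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`); the variable and function names are illustrative, not from any specific pipeline.

```python
import json

# Assumption: the raw LLM response is a JSON array of codings in the
# shape shown above. A shortened two-entry sample stands in for the
# full ten-entry response.
raw_response = '''
[
  {"id": "ytc_UghX3ONnXBqe8XgCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UggBb4ZNQvsUxngCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
'''

DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw: str) -> dict[str, dict]:
    """Parse the raw response and index codings by comment id,
    dropping any entry that is missing a coding dimension."""
    codings = json.loads(raw)
    return {
        c["id"]: c
        for c in codings
        if all(dim in c for dim in DIMENSIONS)
    }

by_id = index_codings(raw_response)
# Look up the coding result for the comment shown on this page.
coding = by_id["ytc_UggBb4ZNQvsUxngCoAEC"]
print(coding["responsibility"], coding["emotion"])  # ai_itself mixed
```

Indexing by `id` makes it easy to join the model output back to the original comments and to spot comments the model skipped or coded incompletely.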