Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The first mistake with autopilot, was calling it autopilot. I know it sounds cool, but it directly implies the car is going to do the driving. Not an excuse for drivers to be stupid, but i think they should’ve called it co-pilot. That way, at least, the name implies its “beside” you, not instead of you. Edit: I watched to the end, and i decided to edit this comment rather than delete it. I’m glad the video maker and I are in agreement.
youtube AI Harm Incident 2022-09-18T05:1…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       virtue
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwX4L5jOHz-DAeI_Vx4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwkPBA86NFZ946zAHF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxDtX3iXdmxyqX3eb54AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzTgcybaoLq1Cu1JZ14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UgzXAmPV6kdGjKFzVN54AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyncEHiSuuwnFmAMmB4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxKu8FPBmGXf2gOY0p4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw-5uRdPqob_jtc1th4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw5NlqhX3Z0yJgQin14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzLjdlh01I6hXu6Np94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]
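A raw response like the one above can be turned into per-comment coding results by parsing the JSON array and checking each record against the dimension vocabularies. The sketch below is a minimal illustration, not the tool's actual implementation; the allowed-value sets are inferred from the values visible in this response and may be incomplete, and the function name `parse_raw_response` is hypothetical.

```python
import json

# Dimension vocabularies inferred from the visible output above
# (assumption: the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "liability", "industry_self", "regulate", "ban"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed", "resignation"},
}

def parse_raw_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and flag out-of-vocabulary values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgxKu8FPBmGXf2gOY0p4AaABAg","responsibility":"company",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
coded = parse_raw_response(raw)
print(coded[0]["responsibility"])  # company
```

Validating against explicit vocabularies is what makes a coding result like the table above trustworthy: any record where the model drifted outside the codebook fails loudly instead of being silently stored.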