Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I mean just take control right? If it’s 100% automation tell the officer. But I highly doubt the car wouldn’t let you take control
Source: youtube · posted 2026-03-08T21:1…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugxleh7lwRtaoPf9xfJ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzs0HfN-fJ-abm-AIt4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzPbJAEEjBYkv58ykV4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgxD28ys9nsWeWY4HLh4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBWJXwEE7CPHFD2SR4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzrfoLHremQqD6qWjJ4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzrEcQbw_hQnMIkUHR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxSJV0wB09O_9OwnVZ4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwXm3FqVvNiraKKWvh4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgzyNGq0For6FFu1fEB4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
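A batch response like the one above has to be parsed and checked before its labels are stored, since the model can emit values outside the codebook. The sketch below shows one way to do that in Python. It is a minimal illustration, not the project's actual pipeline: the `ALLOWED` label sets are inferred from the sample records shown here and the real codebook may define additional values, and the function name `parse_batch` is a hypothetical helper.

```python
import json

# Allowed labels per coding dimension, inferred from the sample batch above.
# Assumption: the real codebook may permit values not seen in this sample.
ALLOWED = {
    "responsibility": {"company", "user", "government", "ai_itself",
                       "distributed", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "virtue", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "unclear"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse a raw LLM batch response, keeping only records whose
    labels all fall inside the allowed sets."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in labels for dim, labels in ALLOWED.items())
    ]
```

Records that fail validation are dropped rather than repaired here; a production pipeline might instead queue them for a retry prompt or manual review.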