Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
if he can’t follow instructions i’d say good luck and hang up! like it’s not her fault she’s just an employee trying to help you. idiots shouldn’t use self driving cars.
youtube AI Harm Incident 2025-01-15T02:5…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugy_AvejDGJ-iNAQkQh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxnrIGGUQLKEUUGKnl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxRgdBCfzFwZDjZLK14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxuonlJYObdD9UeH5x4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwxmPEZXDA8fOFeg7J4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy-qSH4iHvk5gNlGKh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwTPj7_Ah4EJ7U0pEt4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyViNI6adBHKwZliM54AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzrzZ-tSKQqPQfkKrh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyij91NeSX-jdxVt-54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
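A raw response like the one above can be parsed and sanity-checked before its codes are stored. The sketch below is a minimal illustration, not the pipeline's actual code: the four dimension names and the `id` field come from the records shown here, but the allowed value sets are an assumption inferred from this one sample rather than from a full codebook.

```python
import json

# Allowed values per coding dimension. These sets are inferred from the
# sample response above and are assumptions, not an authoritative codebook.
DIMENSIONS = {
    "responsibility": {"user", "company", "government", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"outrage", "fear", "indifference", "approval", "mixed"},
}

def validate_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every coded comment record."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in DIMENSIONS.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_UgzrzZ-tSKQqPQfkKrh4AaABAg","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
records = validate_response(raw)
print(len(records), records[0]["emotion"])  # → 1 outrage
```

Raising on the first out-of-vocabulary value keeps malformed model output out of the coded dataset; a production version might instead collect all errors and route the comment back for re-coding.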