Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
yeah figured this would happen 3 rules are not enough because theyre so vague and make ai's have to decide what does break it and what doesnt so not only are more rules needed for the ai's companys NEED HEAVY RESTRICTIONS AND CONSEQUENCES
youtube AI Harm Incident 2025-11-07T00:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgwTucqq4ZQ1qaLurLd4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy28UiZL7a17FLNoUN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzP2I_m8aETuJOhE554AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwdG_KDACf42mokoIJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwwObM6kTVkOdfWLPF4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugy4vCOkIH-q4SIM6u14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgwTD_yCwyKBhUM_dTN4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwYD9DYu8_6MbDsBxZ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwrBR9lb4PKVczxhYx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugw-x2O70fW6XvExpSN4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
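A raw batch response like this can be sanity-checked before it is stored: parse the JSON and confirm every record carries the four coding dimensions with expected values. The following is a minimal sketch; the allowed category sets are inferred from the values observed in this sample, not from an official codebook, and the `validate` helper is hypothetical.

```python
import json

# Category sets inferred from the sample output above (an assumption,
# not an authoritative coding schema).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "distributed",
                       "company", "developer"},
    "reasoning": {"deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"approval", "fear", "outrage", "indifference",
                "mixed", "resignation"},
}

def validate(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and check each coded record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this dataset appear to use a "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {rec.get(dim)!r}")
    return records

# One record from the response above, used as a smoke test.
raw = ('[{"id":"ytc_Ugy4vCOkIH-q4SIM6u14AaABAg",'
       '"responsibility":"company","reasoning":"deontological",'
       '"policy":"regulate","emotion":"outrage"}]')
print(len(validate(raw)))  # → 1
```

Failing fast on an out-of-vocabulary value catches the common LLM failure mode where the model invents a new category label mid-batch.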