Raw LLM Responses

Inspect the exact model output stored for any coded comment.

Comment
Thats messed up, it should automatically stop answering when talking about murder or suicide. Alexa won't talk about certain things, it definitely encouraged him. I hope the parents win alot of money and get the program fixed or banned.
youtube · AI Harm Incident · 2025-11-16T18:1… · ♥ 1
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
{"id":"ytc_UgwUFGgSm0OmHwb-6Cp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwbtluITfciND-janh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztEUScJKXOl3gwi7p4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxzOoBweaMO_lSlve94AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzBrl6X0-Bn6qCbi0h4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyE0PBd_JjmXWKFw7Z4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxmKBdG_HbjGw0IY154AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgzAd1jxxkzubNIdnC94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyZGTogXc6ZhulHbal4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxgZ1TPDnatM-q9y554AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
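A raw response like the one above is a JSON array of per-comment codes, one object per comment id. A minimal sketch of how such a response could be parsed and validated before storing the coding result — the allowed value sets below are inferred from this sample batch only, and the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from this batch (assumption:
# the actual codebook may include more categories, e.g. "unclear"
# for dimensions where it does not appear here).
ALLOWED = {
    "responsibility": {"company", "user", "government", "ai_itself",
                       "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "unclear"},
    "policy": {"ban", "regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "indifference",
                "disapproval", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Usage with one object in the shape shown above (hypothetical id):
rows = parse_coding_response(
    '[{"id":"ytc_example","responsibility":"company",'
    '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]'
)
print(rows[0]["policy"])  # ban
```

Validating against the codebook up front means a malformed or drifting model output fails loudly at ingest time rather than silently producing an unknown category in the coded results.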