Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Unfortunately it’s unrealistic to think that an AI would understand that being told to stop is an actual request and not just a part of the story its programmed to follow. If you cant understand something like that, you are not old enough to have access to these websites.
Source: YouTube · AI Harm Incident · 2025-08-05T04:2… · ♥ 15
Coding Result
Responsibility: user
Reasoning: deontological
Policy: liability
Emotion: mixed
Coded at: 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugz_r-PFWc5Bky9PIM94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwPGu8liShSKtKmoFl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxHxV69LxNPPC2-SNF4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyaVLvK2MbMS3C02OZ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwW71kG3Xtf8lASpSh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyT9ELrqn4RnpKoCMZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyh--ke1vpe2GMEZnd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz4H81kAJL3eMv5mTh4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzpOBY6Ej23XIi1DKB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgwAl5fjhs1EdaxpX4F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
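As a sketch of how a raw batch response like the one above can be matched back to an individual comment's coding: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) come from the JSON itself, while the helper name `coding_for` is illustrative, not part of any tool shown here. The raw string below is truncated to two entries from the array for brevity.

```python
import json

# Two entries copied from the raw LLM response above (truncated for brevity)
raw = '''[
 {"id":"ytc_Ugyh--ke1vpe2GMEZnd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"mixed"},
 {"id":"ytc_Ugz_r-PFWc5Bky9PIM94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]'''

# The four coded dimensions, as they appear in the JSON fields
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def coding_for(raw_json: str, comment_id: str) -> dict:
    """Return the four coded dimensions for one comment id, or raise KeyError."""
    for entry in json.loads(raw_json):
        if entry.get("id") == comment_id:
            return {dim: entry[dim] for dim in DIMENSIONS}
    raise KeyError(comment_id)

print(coding_for(raw, "ytc_Ugyh--ke1vpe2GMEZnd4AaABAg"))
# {'responsibility': 'user', 'reasoning': 'deontological', 'policy': 'liability', 'emotion': 'mixed'}
```

Looking an entry up by `id` this way is what lets the per-comment "Coding Result" view above be reconciled against the model's batch output.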