Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is so obviously biased and partial. The first thing any court would do is ask why the parents didn't notice any changes in their child's behavior and take the necessary action. Chatbots are not expected to act as guardians for their users.
YouTube · AI Harm Incident · 2025-09-01T23:2…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | user
Reasoning      | deontological
Policy         | none
Emotion        | outrage
Coded at       | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxEU5Hv3czVZ_sShFt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwwzA0IuFjoBcrf4fV4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgyR4h8GKwerH_lQO9h4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxBnViKcjPMDc648bt4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwryapWIvYhHjUX5FB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyLpPRdg1L5XKQ7IxZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxTm9Gu6UPcl42CYph4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy0vf5fFZty9ALx1mp4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgytTYAYDw9GUyJb-_d4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzIaJx_9S8F1TNbUEJ4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "liability", "emotion": "outrage"}
]
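The raw response is a JSON array with one record per comment, each coded on four dimensions. A minimal sketch of how such a response could be parsed and validated before use, assuming Python; the allowed category sets below are inferred from the values visible in this response, and the real codebook may define more:

```python
import json

# Allowed values per coding dimension, inferred from the responses shown
# above (assumption: the actual codebook may include further categories).
ALLOWED = {
    "responsibility": {"none", "user", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"none", "regulate", "ban", "liability"},
    "emotion": {"outrage", "indifference", "mixed", "approval", "fear"},
}

def parse_coded_comments(raw: str) -> list[dict]:
    """Parse a raw LLM response, keeping only records whose every
    dimension carries a value from the codebook."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Hypothetical example input (the id here is illustrative, not a real one):
raw = '[{"id": "ytc_example", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}]'
print(len(parse_coded_comments(raw)))  # 1
```

Dropping out-of-codebook records (rather than raising) keeps a batch usable when the model occasionally invents a label; a stricter pipeline might instead log and re-prompt for the offending comments.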