Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
23:08 does anyone else with actual hallucinations as a symptom of mental illness feel super uncomfy that AI companies have coopted that term for what is essentially just AI Lying? In the case of the girl from Spain the behavior more resembles the human act of unmasking after gaining someone's trust, but I usually see it used for AI just making crap up like that one friend who can never admit to not knowing something.
youtube AI Harm Incident 2025-07-20T23:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyQCqsBknKcifQqojR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxGjUsH1nTknGHmaSZ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyOUfS-S2smk5UsIDl4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzdOs1H2nU5rDXdbFV4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxBhWurJl_hqad6o9x4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "sadness"},
  {"id": "ytc_UgyG2TIvQY2Z_va46UZ4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw99sbGJx0XuJXibF54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgytwjDdsm0VRRbyhPB4AaABAg", "responsibility": "user", "reasoning": "unclear", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxCJHGreHdpQGfqe7l4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "unclear", "emotion": "unclear"},
  {"id": "ytc_UgyV9c-YcJvovy8aMa94AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
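The raw response above is a JSON array of coding records keyed by comment id, with one entry per comment in the batch. A minimal sketch of how such a batch could be parsed and matched back to a single comment (the function name `coding_for` and the trimmed example string are assumptions for illustration, not part of the tool):

```python
import json

# A trimmed example of a raw batched response; in practice this would be
# the exact model output stored alongside the coding run (assumption).
raw_response = """[
  {"id": "ytc_UgyG2TIvQY2Z_va46UZ4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgyQCqsBknKcifQqojR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "resignation"}
]"""

def coding_for(raw: str, comment_id: str):
    """Parse a batched LLM response and return the record for one comment id."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

row = coding_for(raw_response, "ytc_UgyG2TIvQY2Z_va46UZ4AaABAg")
print(row["responsibility"], row["emotion"])  # → company outrage
```

Matching on the `id` field is what lets the displayed Coding Result table be traced back to one object inside the raw batch response.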