Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Confabulation is actually a really good term for what AI does when it makes stuff up. In some brain injuries (most often in severe alcoholics), the brain essentially cannot retrieve actual memories or information, so the person approximates what they think they should be saying. They aren't deliberately lying but are unable to recall an accurate thought. I've met a few of those patients, and it's similar to what happens with AI.
YouTube · AI Moral Status · 2025-10-31T07:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzsKBioXFrB8Xi7-8x4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugwid8bmG_g2GdlAuD14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugws9sfRXx301Cd8ChF4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgxV_YHQCHw31eZyceV4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxgeR_zdgm1SNCilZR4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxlaTTwEVfCbQwj21p4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzBzszkguA4vKkRVq94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxPQGbP5vRzUX5bx3R4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzPyLcmKVdlwcAsChF4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyOM8-_8L_1ct1gW4B4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"}
]
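The raw response is a JSON array of coded records keyed by comment id, so the per-comment coding result shown above can be recovered by parsing the array and indexing by id. A minimal Python sketch (the variable names are illustrative, and the array here is a two-record excerpt of the full response):

```python
import json

# Excerpt of the raw LLM response: a JSON array of coded comments.
raw_response = """[
  {"id": "ytc_UgzsKBioXFrB8Xi7-8x4AaABAg",
   "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxV_YHQCHw31eZyceV4AaABAg",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Parse the array and build an id -> record lookup.
records = json.loads(raw_response)
by_id = {record["id"]: record for record in records}

# Look up one comment's coded dimensions.
coded = by_id["ytc_UgxV_YHQCHw31eZyceV4AaABAg"]
print(coded["responsibility"], coded["emotion"])  # ai_itself indifference
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is a reasonable place to flag the comment for manual review.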