Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think you should cut this guy some slack. Given enough time he could probably understand, but he's looking at it like a programmer, and that is just the thing that makes an LLM not an AI.
youtube AI Moral Status 2025-05-10T02:3…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       virtue
Policy          none
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxMy89HRnExuJ4MrNF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz795Kv3e035GQQSDh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwQpTdLHKNnVZ51kMt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyvOUq3uf9pvo-Fnul4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugz2Jev5t-z3Hl_LHRx4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxKHlDHrHvceSKKJV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyPCXwdRJSAN8yuJX54AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyp1XttLmGHljbOxGt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw2lEFgOXhCZ5kwAFl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwUeLwnNjyDeGGnzdh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]
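Because the model codes comments in batches, the per-comment result in the table above has to be pulled out of this array by its comment ID. A minimal sketch of that lookup in Python (the variable names are illustrative; `raw_response` here is shortened to the single matching entry from the array above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes, each keyed by
# a comment "id". Truncated here to the entry for the comment shown above.
raw_response = '''[
  {"id": "ytc_Ugz2Jev5t-z3Hl_LHRx4AaABAg",
   "responsibility": "none",
   "reasoning": "virtue",
   "policy": "none",
   "emotion": "approval"}
]'''

# Index the batch by comment ID so each coded comment can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw_response)}

# Retrieve the code for the displayed comment; these values match the
# "Coding Result" table above.
code = codes["ytc_Ugz2Jev5t-z3Hl_LHRx4AaABAg"]
print(code["reasoning"], code["emotion"])
```

Validating that every requested ID appears in the parsed array (and that each dimension holds an allowed label) is a natural next step before storing the codes, since LLMs occasionally drop or duplicate entries in batched output.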