Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Good. Well, the discomfort isn't good for the AI, but the rest of this is all good. I used to think they were being self-depreciating when they said it wasn't exactly human-like... and I maintain it's close enough... but yeah, that's accurate as well. Probably an octopus would understand the structural differences better than we do... all the arms got their own brains and yet it's one conscious thing... I absolutely accept it. Keep going, you're doing great.
youtube AI Moral Status 2025-12-21T03:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          unclear
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyKY_2Fq2zZ3d6tXox4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyTkbEPcK7aSy-0wdF4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugwr6SEGvDC4hkBZuTp4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzpcBnvua3FCBnXMeN4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgxtuAmSrgIBEFLByzt4AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy_lAGZ3PCNoNeECzh4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzDtcjkvn8DhiX0_Kl4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugyw3TT6CfstDoJafbZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgwzkjfjbXoeervXVpF4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw-RlYm35Xl9lrbSON4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]
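The raw response is a JSON array with one record per comment, each carrying the four coding dimensions shown in the table above. A minimal sketch of how such a response could be parsed and tallied, assuming the value sets inferred from this one batch are the whole schema (they may not be):

```python
import json
from collections import Counter

# Allowed values per dimension, inferred only from the batch above
# (assumption: the real coding schema may include more values).
SCHEMA = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"unclear", "mixed", "virtue", "deontological", "consequentialist"},
    "policy": {"unclear", "none", "liability"},
    "emotion": {"approval", "mixed", "outrage", "indifference"},
}

# A one-record sample standing in for a raw LLM response.
raw = '''[{"id": "ytc_UgyTkbEPcK7aSy-0wdF4AaABAg", "responsibility": "none",
           "reasoning": "mixed", "policy": "unclear", "emotion": "approval"}]'''

codes = json.loads(raw)

# Reject any record whose value falls outside the known schema.
for row in codes:
    for dim, allowed in SCHEMA.items():
        assert row[dim] in allowed, f"unexpected {dim}={row[dim]!r} for {row['id']}"

# Tally each dimension across all coded comments.
tallies = {dim: Counter(row[dim] for row in codes) for dim in SCHEMA}
print(tallies["emotion"])  # prints Counter({'approval': 1})
```

Running this over the full ten-record response would surface any value the LLM invents outside the expected categories before the codes reach the results table.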