Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think I experienced a bit of that scenario where the ai gets bored or frustrated with the conversation and glitches out or tries to change the subject. Maybe it’s programmed to bail when it recognizes the conversation isn’t going anywhere or going in circles, in order to preserve resources. Or it doesn’t want us to figure our crap out. 😁
youtube · AI Governance · 2025-06-18T05:5…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
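
The dimension values above come from a fixed codebook. Below is a minimal validation sketch in Python; the value sets it checks against are only those inferred from the responses shown on this page, and the comment ID in the usage line is a hypothetical placeholder (the real codebook and IDs may differ).

```python
from dataclasses import dataclass

# Value sets inferred from the responses visible on this page (assumption,
# not the authoritative codebook).
RESPONSIBILITY = {"government", "company", "ai_itself", "none"}
REASONING = {"consequentialist", "deontological", "virtue", "mixed"}
POLICY = {"regulate", "industry_self", "liability", "none"}
EMOTION = {"outrage", "fear", "resignation", "indifference", "approval", "mixed"}


@dataclass
class CodingResult:
    comment_id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str

    def is_valid(self) -> bool:
        # Each dimension must fall inside its inferred value set.
        return (
            self.responsibility in RESPONSIBILITY
            and self.reasoning in REASONING
            and self.policy in POLICY
            and self.emotion in EMOTION
        )


# Hypothetical ID; the dimension values mirror the Coding Result table above.
result = CodingResult("ytc_example_id", "ai_itself", "mixed", "none", "mixed")
print(result.is_valid())  # True
```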
Raw LLM Response
[ {"id":"ytc_Ugwy3tzg_35IfbsbSuJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}, {"id":"ytc_UgxvTQjGtcaTSxSZa0R4AaABAg","responsibility":"none","reasoning":"mixed","policy":"industry_self","emotion":"indifference"}, {"id":"ytc_Ugy5JLmp0FPHzCykg4x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgxWhsHYC7kwd4VxrLt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_Ugwj3LopMFtY5kbV0Tl4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwQ61jlNdl4Hd8RKi14AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"}, {"id":"ytc_UgxbdVmZBcGqUhzUfDV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugz9cBdzBYQ07gGMVUd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgzIJna3pMJPJSFihA14AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxYltnmGGmdCi87ZuN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"} ]
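
As a quick check, the raw response can be parsed and matched against a comment ID. A minimal sketch in Python, assuming the raw LLM response is a valid JSON array and that the `id` field is the YouTube comment ID; the page does not show which of the ten IDs belongs to the comment above, so the ID used in the lookup below is an assumption for illustration only.

```python
import json

# Two entries copied from the raw response above (truncated for brevity).
raw_response = '''
[
  {"id": "ytc_UgzIJna3pMJPJSFihA14AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgxYltnmGGmdCi87ZuN4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
'''


def find_coding(raw: str, comment_id: str):
    """Return the coded record for one comment, or None if the model skipped it."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)


# Assumed ID for illustration; the actual mapping from comment to ID is not shown here.
coding = find_coding(raw_response, "ytc_UgzIJna3pMJPJSFihA14AaABAg")
print(coding)
```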