Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"there was this article about hallucinations in AI.. do you know what I'm talking about?" "The article by Mustafa" "I bet you're right. I didn't read it, can you summarise it for me?" The conversational tells that someone has been using AI too much.
youtube AI Moral Status 2025-10-31T19:2… ♥ 3
Coding Result
Dimension        Value
Responsibility   user
Reasoning        virtue
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwKmmPGxX0kbQuFEa54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxMBNzPv0YV5wk26Ll4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyQBf2-ySXEmEPDvGV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyU52dzUJ0cP6uLeut4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyzUj23QPCSQQm3bkJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwznKZMqydHEd20M0x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy6fUSAOw28Pw25Lrx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyfqT-dDAHuv22h8fl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwn4J8GVJfdW0tbAgN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlPWOP0shh9ZTXadZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
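The raw response is a JSON array of per-comment codes, one object per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) plus an id. A minimal validation sketch before ingesting such a response — the `validate` helper and the rule that every dimension must be present and non-empty are assumptions for illustration, not part of the actual pipeline:

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgwKmmPGxX0kbQuFEa54AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzlPWOP0shh9ZTXadZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# The dimensions every coded record must carry, per the Coding Result table.
REQUIRED = ("id", "responsibility", "reasoning", "policy", "emotion")

def validate(records):
    """Keep only records where every coding dimension is present and non-empty."""
    return [rec for rec in records
            if all(rec.get(key) for key in REQUIRED)]

records = validate(json.loads(raw))
print(len(records))                      # 2
print(records[0]["responsibility"])      # user
```

Dropping (rather than repairing) incomplete records is one possible policy; a real pipeline might instead flag them for re-coding.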