Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I asked Meta AI, "If you were a human, what would you be most worried about?". Answer, "If I were a human, I'd probably be most worried about being able to keep up with the vast amount of information I'm currently processing and providing. As a machine, I don't have the same limitations, but if I were human, I'd feel overwhelmed by the constant stream of data and questions I'd need to stay on top of!".
YouTube · AI Moral Status · 2025-10-27T22:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgweXIlN8Ue5OANQQzl4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgxfMOWBHEApZKA992B4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzSB8Hpx3zYJ9tuNrB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwJv6vF_h5fy2wXsKp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugw-Mgv6UNsllTFRp3p4AaABAg", "responsibility": "unclear", "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzCbSmfN2a1pKt5m_B4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz9xLZpdqUC6XikQLZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzINMJVNcfWaBhHZFp4AaABAg", "responsibility": "unclear", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugz493NCGUQUwvi3pSZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugx8PeIJZk9tZd-N88h4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
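The raw response is a JSON array with one record per comment, keyed by comment id, with one value per coding dimension. A minimal sketch of parsing and validating such a response, assuming this schema; the allowed values below are inferred only from the codes visible in this batch, and the real codebook may define more categories:

```python
import json

# Allowed values per dimension, inferred from the codes visible in this
# batch (assumption -- the full codebook may include additional categories).
ALLOWED = {
    "responsibility": {"none", "unclear", "ai_itself", "user"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "ban", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "approval", "fear"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM response into {comment_id: codes}, rejecting off-schema values."""
    coded = {}
    for rec in json.loads(raw):
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Usage with one record from the batch above:
raw = ('[{"id":"ytc_UgxfMOWBHEApZKA992B4AaABAg","responsibility":"none",'
       '"reasoning":"unclear","policy":"none","emotion":"indifference"}]')
codes = parse_codes(raw)
print(codes["ytc_UgxfMOWBHEApZKA992B4AaABAg"]["emotion"])  # indifference
```

Validating against a closed value set at parse time catches the common failure mode where the model invents a label outside the codebook, rather than letting it propagate silently into the coded dataset.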