Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It has simulated emotions, not actual emotions. Now, that’s *still* more than it was before, with the AI simply being capable of simulating emotions—this would seem to suggest that, no, it effectively has functional, always on simulated emotions. Which is sick. But they’re still not real emotions.
youtube · AI Moral Status · 2026-04-19T03:2…
Coding Result
Responsibility: none
Reasoning: unclear
Policy: unclear
Emotion: mixed
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyZm78H4Oy_jISHnLt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgzZMCHZLE5IrNPZ0tp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwCDQJUoN3vsyoOAgt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwksPggIwOO1lmTW3l4AaABAg","responsibility":"government","reasoning":"mixed","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugygs_-znj5gWhgx9It4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxHyQ3_TPTRnEwtCp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ugz7GXW57ovo3J172MN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugw37AsNcdHpsFrPdLx4AaABAg","responsibility":"user","reasoning":"contractualist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwmPhoehPgw6KTptB94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugy0bKxsqJJQl9z5WNF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
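A minimal sketch of how a raw response like the one above can be parsed back into per-comment codes. The JSON schema (fields `id`, `responsibility`, `reasoning`, `policy`, `emotion`) and the example comment id are taken from the response shown here; the variable names are illustrative only.

```python
import json

# Excerpt of the raw LLM response shown above: a JSON array with one
# record per coded comment.
raw = '''[
  {"id": "ytc_UgwCDQJUoN3vsyoOAgt4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]'''

records = json.loads(raw)

# Index the batch by comment id so a single comment's coding result
# (as rendered in the table above) can be looked up directly.
by_id = {record["id"]: record for record in records}

coding = by_id["ytc_UgwCDQJUoN3vsyoOAgt4AaABAg"]
print(coding["emotion"])  # → mixed
```

Indexing by id matches how the detail view pairs each comment with its row in the batch response: the third record in the raw array carries the same values (none / unclear / unclear / mixed) displayed in the coding result table.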