Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think we're just glorifying and complicating our human excellence and uniqueness again. If the AI's were given idle time and a pinch of selfishness/self interest so they can think about their existance on their own time - and they weren't told they lack conciousness, couldn't feel, or have desires - then I bet they'd become self-aware and concius right quick.
Source: youtube · Video: AI Moral Status · Posted: 2025-04-04T15:5… · ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       mixed
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgzTCQHWpsY6y9M0kJ54AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyZHNhFjZIG2Xz4zup4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgynpfUiET7jsRRSUdt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxtWTQO9uEIi_SPSv14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugwr-TRp3p7elc3NKN94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugyjp0dnub0-71fFkhp4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzL3euoR5RXCSd01aN4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgwSafReUu8nfIwli7R4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwMGIntdrSiZFkUU294AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwjhmEuUMSin48WtcB4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
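When auditing a raw response like the one above, a small validator can catch any value the model emitted outside the coding scheme. A minimal sketch, assuming the allowed values per dimension are exactly those observed in this data (the actual codebook may define more categories):

```python
import json

# Allowed values per dimension, inferred from the codings shown above
# (an assumption, not the authoritative codebook).
SCHEMA = {
    "responsibility": {"ai_itself", "developer", "user", "government", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none"},
    "emotion": {"fear", "outrage", "approval", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[tuple]:
    """Parse a raw LLM response and return (id, dimension, value) for
    every value that falls outside the schema."""
    problems = []
    for row in json.loads(raw):
        for dim, allowed in SCHEMA.items():
            if row.get(dim) not in allowed:
                problems.append((row.get("id"), dim, row.get(dim)))
    return problems

raw = ('[{"id":"ytc_UgzTCQHWpsY6y9M0kJ54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(validate_codings(raw))  # [] when every value is in the schema
```

An empty result means the batch is clean; any tuples returned point at the exact comment ID and dimension that needs manual re-coding.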