Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
my fear with AI actually comes from the fact that i have 0 faith in humanity. we complete trash. if you don't believe that, you just haven't been watching what's happening in sudan, haven't been watching what's happening in palestine, haven't read about the holocaust, haven't read about slavery, etc etc etc. these were not 'mistakes'. these abject acts of per-meditated murder and cruelty that everyone knew was wrong but anyway we are the perfect weapon system for the AI to use. there is no need for the AI to go and manufacture some thing that wipes us out. we're already here and already willing to be bootlickers. why even get rid of such a useful asset? 'it will kill us all with nerve gas' or whatever seems so much less likely than 'it will just take over and humans will become a means to its end. i mean, FFS, THIS IS ALREADY HAPPENING. WE ARE ALREADY DOING IT. so there doesn't even need to be a paradigm shift - just an escalation of the current status quo at the same time, fck it i guess, i can't stop these things and the polity has already decided and i am asked, 'what right do i have to override the will of the polity?' so i guess this is the road we've chosen, so i guess i'll just have fun using ChatGPT until whatever end result because i never had a meaningful choice anyway
Source: youtube · AI Moral Status · 2025-10-30T21:2…
Coding Result
Dimension        Value
Responsibility   user
Reasoning        virtue
Policy           none
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyjxE3ed0-cXL54FoN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwZV9HVtUByR0zeelx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyKzqdR2kM7HQ3gO1t4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIognEwomLuypLOcB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxszYggu5E0cMVPBa14AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyJGvcDzlxp7A9ZQEl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugy5MfxipwIc8Coqa-N4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz-8lEQo8xo8Ulw5Z94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzmPJ7kHypbtvukkSp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz1Jp7u5tsO91sycdh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "outrage"}
]
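A minimal sketch of how a coding row can be recovered from a raw batch response like the one above: parse the JSON array and index the entries by comment id. Only the last entry of the raw response is reproduced here; that this id corresponds to the displayed comment is an assumption, inferred from its values matching the Coding Result table.

```python
import json

# One entry from the raw batch response (the LLM returns a JSON array
# with one object per coded comment). The id-to-comment mapping is an
# assumption for illustration.
raw = (
    '[{"id":"ytc_Ugz1Jp7u5tsO91sycdh4AaABAg",'
    '"responsibility":"user","reasoning":"virtue",'
    '"policy":"none","emotion":"outrage"}]'
)

codings = json.loads(raw)               # list of dicts, one per comment
by_id = {c["id"]: c for c in codings}   # index by comment id

row = by_id["ytc_Ugz1Jp7u5tsO91sycdh4AaABAg"]
print(row["responsibility"], row["emotion"])  # → user outrage
```

Indexing by id rather than by position keeps the lookup robust if the model returns the entries in a different order than the comments were submitted.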