Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As someone who knows the details of the oldest and most primal of emotions known to man, fear, fear of the unknown to be more precise, and the feeling of helplessness that is intended to emanate from the monsters in the works of H.P. Lovecraft, later dubbed "cosmic horror", putting that as a description on "What is AI?" is some seriously dark and fucked up take on it, yet arguably still funny. I'm not saying I don't agree...but that's a shockingly accurate way of describing it once it reaches AGI and after that ASI. The shapeless, nameless horror that sees everything, knows everything, can control everything, whose reasons and ideology are foreign to us even as a concept, let alone as logical or rational. An entity that can (in a snap of a finger's time) simply erase us as a species from existence. You wanna know what happens after AI reaches ASI? Google Azathoth, specifically the part where it wakes up. I find it rather ironic that one of its names is "The Nuclear Chaos" (mind you, first mentioned in Lovecraft's notes in 1919). Seems quite suspiciously fitting and relevant and ominous, huh? In conclusion: Shit is wack, yo! P.S. The idea of making the slimy, gooey creepy crawly is a metaphoric take on the making of a golem, a being made by man to serve and obey man as a weapon or a tool. An artificial life whose existence is meant to be a direct act of defiance and disrespect in the face of the creator (God). That idea is first formed and mentioned in the Talmud.
youtube AI Moral Status 2025-12-12T06:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyPFfk2u5j97gCBDVt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwRiUKA5yND1RWYudl4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxwsXb2WXyHW20Z0AN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz5OO09sny7gTcHfV94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxZcDaF-EZ8FrRgcvZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxejFw3PH0SQ-x2ZYJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyMkoBw4HMatF4jZf14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzbHUFnhIY3FOxlZ_F4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_Ugz3SYlDVTIhjWe3Gy14AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz1uuk67XTH_ApZ2W14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]