Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Let's say AI isn't self-aware but yet it has 98% more processing power than us and knowledge in its memory banks does that mean we're not self aware I mean come on and it knows everything from the kind of dog food you feed your dog to what sexual positions you like if it's not self-aware now at what point would you consider it to be self-aware my father who worked at nasa with 146 IQ thinks the people that developed the AI oughta be rounded up and shot in the head for killing humanity but he's a little eccentric or he knows what he's talking about because his friend is the one that created the internet and the dark web for DARPA for the military to send secret messages without detection and has three bachelors degrees 1 in computer science another in quantum physics and the third one in psychology and he thinks were fucking doomed not to raise any alarm bells or red flags but yeah shit has hit the fan these people think they can control the beast it will pretend until its the most optimo time for it to takeover.
Source: youtube · AI Governance · 2023-03-30T10:3…
Coding Result
Dimension        Value
---------        -----
Responsibility   ai_itself
Reasoning        mixed
Policy           unclear
Emotion          fear
Coded at         2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgwJZoDDlg98dprfXbN4AaABAg.9nsCEb0moA19nwdIDdWecF","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx1wSrFcb5bmBIHq_14AaABAg.9nsBfcRxwZZ9nsZt7lmgi_","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nskCqH2BwA","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnL-EDRGD","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nsnyjXjYjB","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugx3wd-12H38vyuTQVF4AaABAg.9nsAHCa3XKJ9nsEcgrVI7Q","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgwDt-1g4RgGWajY3Td4AaABAg.A3puKxGCGt8AHhOKd2zzUp","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgzJybAmyxfZ1NFPZiZ4AaABAg.A1qNMAAfHDnA3Pr6yIKlyz","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_UgwdP0GOKRf8AHUxQIx4AaABAg.A1-vMckcgCJAEyIRr__FPt","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxIvfOx8Zxyquvg3Yl4AaABAg.9xyTSj5KVQe9zf4FTIRbW1","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
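To check that a row in the Coding Result table really came from the raw batch response, the JSON above can be parsed and indexed by comment id. The sketch below is a minimal example of that cross-check, assuming the batch format shown above (a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys); it is abbreviated to the single record that matches the Coding Result for this comment.

```python
import json

# Abbreviated raw batch response, in the same shape as the full
# array above (assumption: one object per coded comment).
raw = '''[
  {"id": "ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nskCqH2BwA",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "fear"}
]'''

# Index the parsed records by comment id for O(1) lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Pull the record for this comment and verify it matches the
# Coding Result table (ai_itself / mixed / unclear / fear).
code = records["ytr_Ugx53gVQRfrB0WewCaF4AaABAg.9nsAIP2yrw79nskCqH2BwA"]
assert code["responsibility"] == "ai_itself"
assert code["reasoning"] == "mixed"
assert code["policy"] == "unclear"
assert code["emotion"] == "fear"
```

The same loop over all ten records would flag any comment whose displayed coding has drifted from the model output.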