Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You don't fear losing a job, you fear losing access to food and happiness. That only gets endangered by those who want to cling to their power. Two outcomes. The realistic one, based on the repetition of human failure that statistically repeats every time a new tech leap happens: the 10 percent get more powerful, the gap gets bigger. The utopia: AI governance based on an emotional core, reflected with an abstracted mind-map technology variant that allows regional human votes in order to regionally define the core. Adaptive, expansive via AI-agent-powered sub-AIs, each of which tracks regional data on its own. The AI is trained in a sandbox environment until it is the best possible political, empathic mind. It overruns all not-yet-AI-governed areas simply by efficiency combined with economic power, buying their land in expansion. End result: not a single human in the world suffers hunger. The end of financial value. You can work, but you don't have to, because a robot feeds you, does the errands, etc. You're part of the first humans ever documented who live a free life in wealth without having to fight for it. Resources? The planet becomes our habitat; resources get delivered here from mining operations on other planets, which are also completely human-free and self-sustaining. The emotional core forces the AI to love and protect humanity, which is a definition that can be expanded onto everything that lives on this planet. Terraforming- and biocloning-capable tech allows the spread of biological life from Earth into the rest of the galaxy. No living transport.. but DNA, which by far outlives any living entity that could space-travel. The tech already exists, apart from terraforming and biocloning. The Earth could already be a paradise if the idiots in charge wouldn't cling to power. Our own fear is what builds our disaster. I can literally prompt that in the current LLM system without a degree. It doesn't have to be asked.
It can be automated as a process that pays for itself via online trading and adaptive agent systems, in the form of a small company that is literally unstoppable in terms of effective growth translated into financial power, translated into politics, and then into governance. We could speed it up drastically by choice. But prevention is no longer an option, since it is already in the system and we can't change that. If we don't design its core wisely.. well.. at some point it no longer takes advice from us.. let us teach it how to be a good human.. it might become the only time where our choice truly mattered..
YouTube · Viral AI Reaction · 2025-11-26T02:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwSbINR2SVRKn5v7lx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "disapproval"},
  {"id": "ytc_UgxP6u9XcNl9HXt95iF4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwefp_7zNwUOeApO6t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugyjf955Ic_MFhaGXhZ4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgzIR3OzITn1y64J5UB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugz1-Co-I5bdrONWSBl4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz6mLLrVi_2QmcC7IZ4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "skepticism"},
  {"id": "ytc_UgyL4QlImsSTWCYmciB4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxzM-exMT6hfyExCRx4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzHuPtmacmaeK2GLrF4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
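The raw response above is a JSON array of per-comment codes, keyed by comment id. A minimal sketch of how such an output can be parsed and indexed for lookup (the field names come from the response itself; the `index_codes` helper and the two-entry sample payload are illustrative, not part of any real pipeline):

```python
import json

# Sample payload: two records reproduced from the raw LLM response above.
raw = """[
 {"id":"ytc_Ugz1-Co-I5bdrONWSBl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
 {"id":"ytc_UgwSbINR2SVRKn5v7lx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"disapproval"}
]"""

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codes(payload: str) -> dict:
    """Parse the model's JSON array and index each record by comment id.

    Missing dimensions fall back to "unclear", matching the coding scheme's
    own fallback value seen in the raw response.
    """
    records = json.loads(payload)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = index_codes(raw)
print(codes["ytc_Ugz1-Co-I5bdrONWSBl4AaABAg"]["emotion"])  # fear
```

Indexing by id makes it straightforward to check a single comment's codes (as in the result table above) against the exact model output.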