Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Humans invented nuclear weapons that can eliminate humanity, and because of that, humans have learned to avoid wars. There were many wars in Europe, and now there is the European Union. I think people will understand in time that they have to make AI only an instrument: not capable of independent thinking, not able to exist without humans, not able to exist as an independent person.

But we live in a period between two stages of human development, as it was, for example, between feudalism and capitalism. It is something shown in Star Trek, or defined by Karl Marx as the transition from capitalism to communism, forced by technological development. According to Marx, AI will eliminate profit maximization as a motive for human activity. Efficiency achieved through AI eliminates capitalism and opens the way for humanity to move up the Kardashev scale. The Kardashev scale measures a civilization's technological progress by the energy it uses:

- Type I civilization uses all the energy of its home planet (in our case, Earth)
- Type II civilization uses all the energy of its star (e.g. our Sun)
- Type III civilization harnesses the energy of its galaxy (e.g. our Milky Way)
- Type IV civilization would be able to escape the collapse of our universe by making a tunnel to another universe

There are, theoretically, millions of civilizations of each type. In other words: will humanity learn to use AI as an instrument, and to avoid AI as a person, so that capitalism disappears even before we become a Type I civilization?
youtube AI Governance 2026-02-08T22:5…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       deontological
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugz_hO2JUA2C27HBgLJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyJpIyYM4Aj5DgyYoJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxt1nc7cIWi_iQHrCZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxBYIReoBGybQ1xme14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxjFvMZQ29vySDF8Wt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwEhCoOzsQvr8t-Krl4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyAG9y5lARy1gySj_B4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz7MBPwoHvELJSfaHh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyfvMMDXyzg8_HJpEd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgwCqvGHz2CQuuSbb3d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"fear"}
]
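The raw response above is a JSON array of per-comment codes with four coding dimensions. A minimal sketch of how such a response could be parsed and validated, assuming the value sets observed in this export (the ALLOWED table is inferred from these ten records, not from an authoritative codebook):

```python
import json

# Allowed values per dimension, inferred from this export only
# (an assumption, not the project's official codebook).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"resignation", "indifference", "fear", "mixed",
                "outrage", "approval"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response into a list of validated code records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing 'id'")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
    return records

# Hypothetical single-record example (the id is made up for illustration).
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"deontological","policy":"regulate","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["policy"])  # → regulate
```

Validating against a fixed value set catches the common failure mode of LLM coders drifting outside the label vocabulary, so malformed batches fail loudly instead of contaminating the coded dataset.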