Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The problem with AI developers and especially CEO's is that they have every incentive to be the first as they see it as a winner takes all scenario. They have no restraints or laws governing them, and they are so rich they probably have ways to insulate themselves from any problems they create. All this while we "common people" can be determined to be non productive to the development process and therefore unneeded, we can be seen as just so much drain or resources. That would include most of the world's population, probably 99.5%. Imagine, you wanting to retire or have just bought a house and have a job to pay for the house and car, and AGI stepping in and replacing you at your job and you can't even get a job flipping burgers as those would be deemed unhealthy. So, you lose your house and car and if you are retiring AI sees a better fit for your money in more GPU's or cooling or power and it strips away your savings, cuts are reroutes your power to itself and then to cap it off, since the power has been rerouted no gas or diesel for your vehicle can be pumped, natural gas can't be pumped to your house, all food production is halted and medical services are curtailed. This is sort of the scenario of an EMP going off frying our electrical grid. Over 90% of the population of the U.S. would die off. Not only that, over 90% of the rest of the world would also die off. Just so the AI CEO's can be first, and without safety restraints. The ultimate of authoritarianism.
youtube 2026-02-10T17:4…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyYh-Ot8JR5oBJXY4Z4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxIMHL3MX4xzlUgGdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwe5N036ZVfOGHJ-kJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx8X7TeNXBaCilip954AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzgWqqpA7rrfhU4Lq94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzviCTevUvQ-LFJx054AaABAg","responsibility":"user","reasoning":"virtue","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwTeukz7AIo-mWXD0V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwzclpZjji0_tiY3xp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwPsDm0AnSpFD0-_5x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgxCl3iYByl-rjAxeyh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"}
]
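The raw response is a JSON array with one object per coded comment. A minimal sketch of parsing and sanity-checking such a batch; the allowed label sets below are only the values observed in this batch, not necessarily the full codebook, and the `validate_batch` helper is hypothetical, not part of the pipeline:

```python
import json

# Label values observed in the batch above (assumption: the real
# codebook may define additional values for each dimension).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "approval", "indifference", "mixed"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with missing or unknown labels."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this export all carry the ytc_ prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"bad id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with the first record from the batch above:
raw = ('[{"id":"ytc_UgyYh-Ot8JR5oBJXY4Z4AaABAg","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
batch = validate_batch(raw)
```

A check like this catches the common failure mode of LLM coders: an otherwise well-formed response containing a label outside the codebook, which would silently skew downstream tallies.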