Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
An economy is a system for trading working hours for working hours. If only one portion of society can work (by using AI), only that portion will be able to trade with each other on equal terms. If 10 hours of your work produces only as much as Michael downtown produced with his army of machines in 15 minutes (for the same items), you will have to work 10 hours to trade for 15 minutes of his work. Jobs that can't be done by AI will hold real power, but not for long if engineers' and mathematicians' jobs are replaced, because those people will start designing machines that can do farming, electrical work, plumbing, etc.

When everyone's job can be done effectively by robots, a person's economic power will be decided by how many robots they own. We will enter a new era where people with more robots can compete for more resources to obtain more robots, widening the gap between societal classes, which honestly is no different from the society we have today. That is, unless AIs become smarter than humans and can self-sustain, gather resources to self-replicate, and self-improve. That scenario is very promising but much scarier. If they don't turn their backs on us, we might enter a new golden age where everyone is equal. But if one thing goes wrong, it could very well be the end of humanity. Basically, once technology reaches that point, our fate is no longer in our hands.

Another scenario is that simple labour jobs are replaced first and only high-IQ jobs are kept. Then the smart people stop improving AIs so that they themselves can't be replaced. Now they hold the true power in society.
youtube AI Harm Incident 2024-09-24T01:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  unclear
Reasoning       consequentialist
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_Ugxwvtf9x_gGF9IOgu14AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz8YdXPnBa2f-OwteZ4AaABAg", "responsibility": "company",   "reasoning": "mixed",            "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgxIqPkQP-0ra0PGLvV4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugys1faD_2uQmWoQIhZ4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "outrage"},
  {"id": "ytc_UgzXKTZpipmgk_4MEH94AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgxcdWyrIm6X2w_nvU14AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_Ugwlt9vvpGQrAPpgTHd4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "ytc_UgzmibkUYQuFx-c3J8J4AaABAg", "responsibility": "company",   "reasoning": "consequentialist", "policy": "unclear",   "emotion": "resignation"},
  {"id": "ytc_UgylXOGKacvMtEtEukt4AaABAg", "responsibility": "unclear",   "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgyUf98pD33Ezhdk6B54AaABAg", "responsibility": "unclear",   "reasoning": "deontological",    "policy": "regulate",  "emotion": "approval"}
]
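The raw response is a JSON array of per-comment codes keyed by comment id, one object per coding dimension set. A minimal Python sketch of how such a response could be parsed and tallied (the `raw` string here is only an excerpt, the first two entries of the response above; the variable and loop names are illustrative, not part of the actual pipeline):

```python
import json
from collections import Counter

# Excerpt of the raw LLM response shown above (first two coded comments).
raw = '''[
  {"id": "ytc_Ugxwvtf9x_gGF9IOgu14AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz8YdXPnBa2f-OwteZ4AaABAg", "responsibility": "company",
   "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"}
]'''

codes = json.loads(raw)

# Index codes by comment id so a per-comment view can look up its row.
by_id = {entry["id"]: entry for entry in codes}

# Tally each coding dimension across all coded comments.
for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, dict(Counter(entry[dim] for entry in codes)))
```

Looking up a single comment's codes is then a dictionary access, e.g. `by_id["ytc_Ugxwvtf9x_gGF9IOgu14AaABAg"]["emotion"]` yields `"fear"`.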