Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This scenario is highly unlikely. First, I have yet to see an AI that can do worthwhile work autonomously. I try constantly to make AI do work, but it needs to be babysat the entire time to produce useful results. It takes a lot of human effort to make the AI produce a useful output. This creates as many jobs as it replaces. Second, when (if?) AI and robots can do work, the price of production will go down. The cost of living will drop. This will make us wealthier, not poorer. We can have good lives on very low income. Third, in a democracy people will vote for a system where taxes on companies and on work done by AI and robots will fund a UBI. Fourth, when AI and robots can do meaningful work at a low price, everyone can get AI and robots to work for them. Anyone with an idea or an unmet need can start their own production. When every person gets robots and AI at their disposal, they become self-reliant. They can order their personal AI and robots to produce what they want. Any surplus they can sell. If AI and robots ever get to the level you imagine, the result will be abundance for everyone. What we call «work» today will be optional. Every person will be managing their robots and AI. Every person will be like royalty, with plenty of servants to do things for them. The royals don't have to work. They just tell their servants what they want. That is a much more realistic scenario. You're welcome.
Source: YouTube, "Viral AI Reaction", 2025-11-23T14:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgxjM7y0Z1hFpix3cXd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxSb2tB-T0MasBEoNx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxJ-_R_EcLpXeU25Dl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgwloMIPMeRdkA2LmYt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwEIDwU0AEfI8ZuPpl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyt54qCgH_jabOrVIp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzTCfSefLHwaNK2F0Z4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzSQVtEiOew3jnRz2J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwpvsKaSrSw3xnb_Q14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyWWNtukNqdnyQ1Kt94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
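A minimal sketch of how a raw response in this shape can be parsed and indexed by comment id for inspection. This assumes only the JSON structure visible above (a list of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` keys); the `parse_codings` helper is illustrative, not part of any actual pipeline, and the two embedded records are copied from the response above.

```python
import json

# Two example records copied verbatim from the raw LLM response above.
raw = '''[
  {"id":"ytc_UgxjM7y0Z1hFpix3cXd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxSb2tB-T0MasBEoNx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Every coded record is expected to carry exactly these keys.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw LLM response and index the codings by comment id.

    Raises ValueError if any record is missing one of the expected keys,
    so malformed model output is caught before it reaches the table view.
    """
    records = json.loads(text)
    codings = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        codings[rec["id"]] = {k: rec[k] for k in REQUIRED_KEYS if k != "id"}
    return codings

codings = parse_codings(raw)
# Look up the coding for the comment shown on this page.
print(codings["ytc_UgxjM7y0Z1hFpix3cXd4AaABAg"]["emotion"])  # indifference
```

With the full ten-record response as input, the same lookup reproduces the Dimension/Value table above for any of the coded comment ids.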