Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
This is the first time I've heard anybody else say this, but I always thought it was obvious that new jobs will not be created by AI technology, or at least very few, and only in the beginning. If we lose 95 percent of jobs, maybe 1 percent of new jobs are created for humans, but eventually AI will not even need to be controlled. Now your niece does the job of 5 with the help of AI. In ten years I don't see humans getting involved in this kind of work at all. It will be too rudimentary. Just like we don't need humans to oversee a calculator today, doing basic math.

Of course people will lose their jobs in layers. Accountants, possibly online teachers and similar, are the first to go, then Optimus will be able to do plumbing and carpentry etc. in 5 years. In 10 years there won't be anything we can do that the machines cannot do, but I think the bottleneck will be society; we will be slow to actually use it and adapt. It may take 30-40 years plus before we reach the point where "everybody" is being replaced, not because of technological limitations, but because of our fear and also just not getting it done as quickly as it could have been done.

But the real challenge will be how to prevent a handful of individuals who master AI from taking all the resources. I think the solution must be a high tax on AI. Imagine the margin a company gets if you can run it with one person who just hires a whole organization of AI agents. No employees in a Fortune 500 company. The profit would just run insanely high without any of the biggest expenses. If one man or woman is allowed to own all this and reap the benefits just because they happened to utilize the tech in the right way at the right moment, the difference between poor and rich will reach levels where we see "revolutions".
Source: youtube · Cross-Cultural · 2025-09-28T02:4…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzRPXNTZCTrtAN1T3d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"sadness"},
  {"id":"ytc_Ugz-dAh6yY07UmVsdap4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzmRQrMazg2PG-Kyzl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"disapproval"},
  {"id":"ytc_UgwzgayDMIWT1GQk9pt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxs9gOHaPVFF6ySORN4AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzp30_K3HqXud4tlq14AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwNW_pK1bcXFH-Z6WV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgweeZtI3dAG6pipQPB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzaANFh_hkHpLsYS3R4AaABAg","responsibility":"distributed","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyowVuOk1zpiHMxgIJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
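The coding result shown above is one entry extracted from this batched JSON response. A minimal sketch of how that extraction might be done, assuming the raw response is valid JSON keyed by comment id (the ids and field names below are taken from the response itself):

```python
import json

# Raw LLM response, abbreviated to two of the ten entries shown above.
raw = '''[
  {"id":"ytc_Ugz-dAh6yY07UmVsdap4AaABAg","responsibility":"none",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyowVuOk1zpiHMxgIJ4AaABAg","responsibility":"ai_itself",
   "reasoning":"consequentialist","policy":"none","emotion":"fear"}
]'''

# Index the batch by comment id so one comment's codes can be looked up.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# The entry matching the comment displayed on this page.
entry = codes["ytc_Ugz-dAh6yY07UmVsdap4AaABAg"]
print(entry["reasoning"])  # consequentialist
print(entry["emotion"])    # fear
```

Looking the entry up by id rather than by position guards against the model returning the batch in a different order than the input.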