Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I totally agree...mostly. Where I disagree is that it will be Knowledge & Decision Based jobs (management) that are going to be hit hard and first. This is because the automation processes are already available and extremely cheap to distribute (pretty much free due to the internet). The basic algorithm programs (not even real A.I.) have been actively displacing workers for some time already. Do you remember when "Bookkeepers" were in demand and a good job...not so much any more, because algorithm programs have replaced nearly all of them. Truly, a highly skilled worker is still better than an algorithm or A.I. However, like in evolution, you don't have to be The Best. You only have to be good enough to effectively breed in Evolution; or in Industry, you only have to be "cost effective". In short, Automation is going to take over nearly all jobs, and it will be many of the better paying jobs that go first. As for how to ameliorate this situation, I see all forms of making employers pay more to Workers as a defeat-expediting proposition. If Employers have to pay more for Workers, they will simply stop being employers and instead hasten their adoption of A.I. and automation. As I see it, the only practical solution is the "Automation Tax" in some form (it could be a very high blanket tax that could be alleviated by hiring workers instead of adopting Automation). The money from the Tax would then be distributed to the unemployed.
youtube AI Jobs 2025-10-24T03:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          none
Emotion         mixed
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugzy-YJpUNP7rOgF_dN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugypu17-Z9EWzCQp06R4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyGD__s-PrlXSblFWR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxitOxuSX_PL9t4nSV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzAvTH_GqPj8DF_R554AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgwFjRORS2Rp3Ik26dN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgysugE2Ox3ruDK0nIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxYosWBQPegrXphGux4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyGZvVhQz0ghaZYua14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxq4Gcjf0qoegVX-ol4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"}
]
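The coded dimensions shown above correspond to one entry of this batch response (matched by comment id). A minimal sketch of that parse-and-lookup step, assuming the raw response is held in a string; the function name and the field-validation guard are illustrative, not part of the tool:

```python
import json

# Two entries copied from the raw LLM response above, for brevity;
# in practice, parse the full array.
raw_response = '''
[ {"id":"ytc_UgxitOxuSX_PL9t4nSV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzAvTH_GqPj8DF_R554AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"} ]
'''

# The five fields every record in the batch response carries.
EXPECTED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def extract_codes(raw: str, comment_id: str) -> dict:
    """Parse the LLM batch output and return the coding for one comment."""
    records = json.loads(raw)
    for record in records:
        # Guard against the model dropping or inventing fields.
        if set(record) != EXPECTED_FIELDS:
            raise ValueError(f"unexpected fields: {sorted(record)}")
        if record["id"] == comment_id:
            return record
    raise KeyError(f"no coding found for {comment_id}")

codes = extract_codes(raw_response, "ytc_UgxitOxuSX_PL9t4nSV4AaABAg")
print(codes["responsibility"], codes["emotion"])  # company mixed
```

Validating the field set before use catches the common failure mode where the model emits a malformed or partial record for one comment in the batch.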