Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Of course there will be people who say, "Well, AI is not replacing my job" (plumbers, on-site tradesmen), but just because your job is not going to be immediately replaced, you will still be affected by the impact, as the people who pay for your service may be among those who lost their jobs. And the few jobs still done by humans will mean more people training to take those jobs and directly competing with you for fewer positions, so wages will be driven down in all labor jobs, not just the ones first to be replaced by AI. Then, when robotics advances to an adaptive level of human skill, there will be even fewer specialized areas where a human can still do a better job, or at least be more cost-effective to hire because it's a one-off situation where building a robot would be cost-prohibitive. But for any job in which you do the exact same thing over and over, even non-AI automation has already reduced the number of people needed. And AI at first won't eliminate every worker, but it could mean one person can do the work of ten. So even if you keep your job, don't expect not to see the effects of others losing theirs when you're talking about millions of jobs being displaced.
youtube Cross-Cultural 2025-11-28T17:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyiQnyx07jKd-zqayx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzNXFcT14MdOczma_R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxVg8cCGpS0l56Dkn14AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzMKcyChpKykSPyBQd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyCZeAx70AzFd6w00d4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxrRlNdNlg0XIrbhrB4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwY4cNwiv4wOjCEepN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx_tXWo4U7d7IK-pFR4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwNjoUVivyDhl3uash4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgxcvkRhWbAbpDwt5gV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"}
]