Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
As one person put it "if you're using AI to cut jobs, then you're admitting you've run out of creativity". We should accept advances in productivity, but it shouldn't come at the cost of jobs. I also personally believe the extra revenue from AI should go towards UBI for people whose job is no longer needed and to give people the option to study something else so they can be productive again. AI is going to create a global shift in what jobs are necessary. The concept of "entry level jobs" is going to have to change. The jobs that are still available need to be paid high enough for the people who are going into them to maintain their quality of life. The reason why manufacturing in the US would all be done by robots is that Trump's idea that you'd just kick everyone out of their cushy high paying office jobs and make them all work in factories isn't realistic since no factory owner wants to pay people that much to work on an assembly line, and those workers would probably want even more money in compensation for doing work they never signed up for. Of course it will be automated. That doesn't help the job market, though. We might eventually figure out what to do with everyone as more work becomes unnecessary, but UBI is essential for bridging that gap. We'll be a lot more efficient as a society when we do. There's a lot of industries that don't have enough workers and automating certain types of labour will free up those workers to do something only humans can do, but they'll need to be retrained and they might not have the money to go through uni again, so that should be part of their compensation for being let go.
Source: youtube · "AI Jobs" · 2025-10-08T13:0…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       contractualist
Policy          liability
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxHbAadmndHz5tD3G14AaABAg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgzKz3OKVrVw6n_4Rel4AaABAg", "responsibility": "ai_itself",  "reasoning": "deontological",    "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_UgwROwJVvBPmpcCnuLx4AaABAg", "responsibility": "none",       "reasoning": "virtue",           "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgwNelsIN68yc-h3JqF4AaABAg", "responsibility": "none",       "reasoning": "virtue",           "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgxnmL1OyKzR58tn2Cx4AaABAg", "responsibility": "company",    "reasoning": "contractualist",   "policy": "liability", "emotion": "indifference"},
  {"id": "ytc_UgwWlfP2Bc3xTI51gO14AaABAg", "responsibility": "government", "reasoning": "contractualist",   "policy": "regulate",  "emotion": "indifference"},
  {"id": "ytc_UgywGCzxpIMxheJ46QJ4AaABAg", "responsibility": "ai_itself",  "reasoning": "consequentialist", "policy": "ban",       "emotion": "outrage"},
  {"id": "ytc_Ugx717kojhgZLwEQ8614AaABAg", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",   "emotion": "fear"},
  {"id": "ytc_Ugy5FbpZwPEYlTGMUmt4AaABAg", "responsibility": "company",    "reasoning": "contractualist",   "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwiK852RMhkI_2gBFx4AaABAg", "responsibility": "government", "reasoning": "virtue",           "policy": "none",      "emotion": "approval"}
]
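The Coding Result shown above is a single row drawn from a batch response like this one: the model codes many comments at once, and each comment's codes are recovered by matching its id. A minimal sketch of that lookup, assuming the raw response is valid JSON (the function name and the two-row excerpt here are illustrative, not the tool's actual API):

```python
import json

# Excerpt of a raw batch response from the coding model (two rows for brevity).
raw_response = """[
  {"id": "ytc_Ugy5FbpZwPEYlTGMUmt4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwiK852RMhkI_2gBFx4AaABAg", "responsibility": "government",
   "reasoning": "virtue", "policy": "none", "emotion": "approval"}
]"""

def index_codes(raw: str) -> dict:
    """Parse a batch JSON response and index each coding row by comment id."""
    return {row["id"]: row for row in json.loads(raw)}

codes = index_codes(raw_response)
code = codes["ytc_Ugy5FbpZwPEYlTGMUmt4AaABAg"]
print(code["responsibility"], code["policy"])  # → company liability
```

In practice a parsing step like this would also validate that each row's values fall within the coding scheme's allowed labels and flag ids the model dropped or invented.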