Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AGI is a nebulous target, and in some ways a pointless one. Some very smart people who work in the industry think it already arrived late last year, and those who think it will never be achieved are deluding themselves. But it doesn't actually matter in the short term. What people are missing is that AGI isn't going to cause the avalanche of white collar job losses. Effective _agentic_ AI is, and it won't need to be AGI to do it. AI companies are working tirelessly toward the goal of developing agents that can replace multiple human workers, because that's where the money is, and corporations will be jumping at the chance to cut their labor costs by 99%. This is going to start within a year, during the tenure of a president who was more concerned about striking Biden's legislation because of the name attached to it than he was about what it contained - and one in particular was an AI safety initiative. There is nothing holding them back. Unemployment will reach catastrophic levels before the slow gears of government implement any sort of meaningful UBI. Once AGI is heavily involved in robotics and materials science research, blue collar jobs will be the next to fall.
youtube AI Jobs 2025-06-18T02:4…
Coding Result
Dimension      | Value
-------------- | --------------------------
Responsibility | unclear
Reasoning      | consequentialist
Policy         | unclear
Emotion        | mixed
Coded at       | 2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy5ymXRH857F79eziZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwLsCulBcIwYuhlSRR4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxnUY7sakrSVd_MuTN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxCGmxRTdUKzaXFIJV4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwTzb7TdekguPl8arl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxfREGlJPcL-hMO_oN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyOrEQDxiayXNJV5dR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzYuVNbE7aihTsp4Jl4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyHFeJrs7GAnT_O97p4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugx6krMrRI9kfJR1Bhh4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
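To inspect the coding for a specific comment, the raw response can be parsed and indexed by comment id. A minimal sketch, assuming the raw response is valid JSON (the excerpt below reuses two entries from the array above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (excerpt of the
# full response shown above).
raw = """[
  {"id": "ytc_UgxCGmxRTdUKzaXFIJV4AaABAg", "responsibility": "unclear",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzYuVNbE7aihTsp4Jl4AaABAg", "responsibility": "government",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"}
]"""

# Index the entries by comment id so any coded comment can be looked up directly.
codes = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment displayed on this page.
code = codes["ytc_UgxCGmxRTdUKzaXFIJV4AaABAg"]
print(code["reasoning"])  # consequentialist
print(code["emotion"])    # mixed
```

This matches the Coding Result table above: the fourth entry in the raw response carries the unclear/consequentialist/unclear/mixed values displayed for this comment.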