Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The issue with suddenly being a manager of agents over a task is whether there is, as he suggests, pent-up demand for it or not, and whether that pent-up demand is great enough to keep the demand for humans the same. Radiology is the example he gave, because there is a value-add that the human beings can bring and pent-up demand for medical imaging. But if you are a warehouse worker, a robot with AI can replace 99% of your coworkers, and whoever managed you will now manage your robot replacements and accrue the benefits of whatever pent-up demand, not you. Same with the dock workers - containerization increased demand for shipping and did not reduce the need for workers? Fine. But robot AI dock workers could replace dock workers entirely, and there would not be an increase in the need for humans anywhere else - the dock supervisor now manages a fleet of robo workers handling the increase in shipping. And shipping will increase without a doubt, if you can lay off entire crews of human workers, quit paying salaries, and just keep the supervisor. In both cases it's not AI by itself, nor robotics by itself, but AI plus robotics that disrupts. This combination removes the need for human labor in any job that is primarily physical with some cognition required. If most humans can command a fleet of robots, we'll see more jobs where humans leverage private or shared robotic fleets to carry out economic activities. That is the future.
Source: youtube · AI Jobs · 2025-10-29T13:3… · ♥ 1
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz1VZRgGrTODPA5Wxp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_UgwTklUvJ3lYBSWey2h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"}, {"id":"ytc_UgwIaXLvxnbmZfrY8yt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugxo299dphZZPW5UdGt4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"approval"}, {"id":"ytc_UgzhkLoDms5yyXZDxj94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyqUwjyyWEAbq3e6Ed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"approval"}, {"id":"ytc_UgwtIMJYZSj1NQAP3NR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_Ugyre-aYvaliDme1jBl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"ytc_Ugy2n0wtZ5u4SuwD3X14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"mixed"}, {"id":"ytc_UgyveKqoGadcCFH67Fx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]