Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
At some point a Universal Basic Income is going to be necessary if we want to prevent societal collapse. That said, on X date my hypothetical transportation company will no longer need human drivers; AI will be able to perform that job as competently as any human without ever getting tired. My hypothetical company now consists of 40 drivers, 6 mechanics, and a dozen other managers and support personnel. On that date I will begin phasing in self-driving trucks. I will need to begin letting human drivers go... however, it's not going to happen all at once. I'm also likely going to need more mechanics to handle a lot of the maintenance tasks normally performed by drivers. So 10 years later, my company will likely be around 8 drivers, 12 mechanics, and a dozen other managers and support personnel. That will be a roughly 40% reduction in my workforce. There will not be any work for people who want to drive trucks for a living; however, there will be work for people who learn how to fix trucks and specialize in trucks that can drive themselves. The bottom line is that the job market is going to look a lot like a game of musical chairs, with a higher and higher percentage of workers left standing. The key to getting and keeping a seat will be (and in many fields already is) predicting what the market will need in 5-10 years and getting that training. That said, even if you have a chair... at some point there will be too many workers left standing for society to continue to function.
Source: YouTube · AI Governance · 2025-09-04T16:2… · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_Ugz3FsypOT3pbAUhpgF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"ytc_Ugy9TRtUZKuhcEXaO5h4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"}, {"id":"ytc_UgxOgSVtnUPVBQTpMN54AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgzbUIujMksV8VKyOA54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzzT4R5XUm5HIMi4RJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}, {"id":"ytc_UgyF9rzijSmCcPPcLbZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugz7jjUbijHAqBwwgYF4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"unclear"}, {"id":"ytc_UgyMNiaNBDIS5exSOmZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"}, {"id":"ytc_Ugx0Pgqc9VrBHC2yhIV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}, {"id":"ytc_UgzGzBG-_GMbFVKTDml4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"} ]