Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If A.I. takes jobs with the intent, obviously, of meeting demand, won't they stop at some point so as not to completely take away every consumer's job/income (the demand)? If they took every job, no one would have the money to purchase what A.I. is producing. Save on labor, sure–but at the expense of taking away what consumers actually use to consume? McDonald's starts using A.I./robots, and the employees stop purchasing from Mcdonald's–big deal–but once every employer adopts the idea, no one will have money to buy the cheaply made product–no matter how cheap–because nobody has a job. The "want" items will be the first to go along with any excess spending. Budgets will go down tremendously, which means the inventory producers used to make will not be made. Huge decrease in revenue. Eventually, though, even the "need" items will sit in the stock rooms as well. It inevitably is not reasonable, outside of governments providing a set income to the public so that consumerism can still occur; however, nothing like on the level we would see with capitalism. Some may get behind the idea to support socialism/communism, but the entrepreneurs of the world (if they know any better) should shut this down quickly considering the direction this could go.
youtube AI Governance 2025-09-08T13:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugzw-nVfUNxU4yvl5KR4AaABAg", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwmCZJ2sMHD9zAPoqx4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugw5JTDlOGd5InD9qnV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "ytc_Ugyo7BDWf7rcGN0JYFZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytc_Ugwc0DJEuFZUfMwoAkJ4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxd22VDaHL6QMymx4F4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgwL4fwUNxVxUuutZ6B4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugz5PVUYEymYMbav0UR4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgyaqsR1iKZgn0HNuMN4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzgPps-db-ksVo2E4t4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "liability", "emotion": "mixed"}
]
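The raw response above is a JSON array of per-comment codings, keyed by comment id. A minimal Python sketch of how such a batch could be parsed, validated, and indexed by id is shown below; the allowed-label sets are inferred only from the values visible in this dump (the full codebook may define more), and `index_codings` is a hypothetical helper, not part of the tool itself.

```python
import json

# Allowed labels per dimension, inferred from values seen in this dump
# (assumption: the real codebook may include additional labels).
ALLOWED = {
    "responsibility": {"company", "developer", "user", "ai_itself",
                       "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "resignation",
                "indifference", "mixed"},
}

def index_codings(raw_response: str) -> dict:
    """Parse a raw LLM batch response and index codings by comment id.

    Raises ValueError if any record uses a label outside ALLOWED,
    so malformed model output fails loudly instead of being stored.
    """
    records = json.loads(raw_response)
    out = {}
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} = {rec.get(dim)!r}")
        out[rec["id"]] = {dim: rec[dim] for dim in ALLOWED}
    return out

# Example using one record copied verbatim from the raw response above:
raw = ('[{"id":"ytc_UgwL4fwUNxVxUuutZ6B4AaABAg","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
codings = index_codings(raw)
print(codings["ytc_UgwL4fwUNxVxUuutZ6B4AaABAg"]["policy"])  # regulate
```

That single record matches the Coding Result shown for this comment (company / consequentialist / regulate / fear), which is how a raw batch line is tied back to an individual inspection page.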