Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Are you worried about losing your job in the AI Revolution when they begin to take over our jobs in masses? Some say this won't happen, companies will only use AI to augment their workers. My question is, why would businesses leave 80% of the benefits of AI on the table when AI is, and by the day, becoming more and more capable of replacing the human workforce? Some say the reason is that businesses realize that full adaption to these benefits, replacing the majority of their human workforce would only cause a major economic collapse and this in turn would cause the collapse of the very businesses that caused to economic collapse in the first place. But you and I know the GREED of American big Businesses, don't we? First one will begin replacing their workforce, and then another, and another, realizing they can't remain competitive if they don't. Then businesses realize that if they don't replace theirs before the government steps in and stops it, they will need to do it quickly. Before you know it, the economic collapse is here. One greater than the "Great Depression". This is what my new book is about. complete a - z plan on how to deploy AI into our workforce in mass, displacing the human workforce as much, and as fast as it can and NOT create an economic collapse while benefitting both businesses AND the populous, actually creating a future of abundance all proven mathematically, displayed in tables that make it very easy to see how the "Platform" works. Available through online book retailers in March, The New Social Contract: AI & The Future of Work.
youtube 2025-01-31T04:3…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyZ-5WM8jJgOhiou2F4AaABAg", "responsibility": "company", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzZA8xVsShgPubffoV4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgzezL2bxCD7bsCvURR4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwt52w1kDYGHvUj1rF4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugy6lLbjbiIiZcZLO454AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwfi1coa163N7YDuO94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxLuzySUUH4JVWiTeV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxX9kfN75FsJDyHZpp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugwxp75_3aFPqqv4ykJ4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugzmam-XfQmTITmFt9x4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
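To inspect the exact model output for a given coded comment, the raw response can be parsed and indexed by comment id. A minimal sketch, assuming the batched JSON-array format shown above (the array below is shortened to a single entry for illustration; the field names and id are taken from the actual response):

```python
import json

# Raw LLM response: a JSON array of per-comment codings,
# each with id, responsibility, reasoning, policy, emotion.
raw = (
    '[{"id": "ytc_Ugwt52w1kDYGHvUj1rF4AaABAg", '
    '"responsibility": "company", "reasoning": "consequentialist", '
    '"policy": "unclear", "emotion": "fear"}]'
)

# Index the codings by comment id for direct lookup.
codings = {item["id"]: item for item in json.loads(raw)}

# Retrieve the coding for the comment shown in this view.
coding = codings["ytc_Ugwt52w1kDYGHvUj1rF4AaABAg"]
print(coding["emotion"])  # fear
```

This lookup confirms that the table above ("Responsibility: company", "Emotion: fear", etc.) was extracted from the entry in the raw response whose id matches this comment.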