Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Most people don't use or even need AI for anything in their lives, other than maybe asking a dumb question they could have figured out themselves without much effort. So the idea of AI making everybody more "productive" works only in selected cases, and it doesn't guarantee any improvement beyond the existing level, unless there's automation that gets rid of the person or people who were supposed to get the increase in productivity. That means you now have multiple people not getting any money, which means they can't buy your product regardless of how cheap you can make it. Now apply that on a large scale, talking about millions of people: a huge portion, or let's say everybody, gets fired because the AI did their job "better" or "cheaper". Now nobody can buy the products since they don't have an income, the corporation can't make a profit or even cover its operational costs, then the corporation goes bankrupt and nobody gets anything. Make it a large-scale situation and the whole economy collapses because money doesn't flow anywhere. Now about "UBI", since everybody thinks it's a solution: first of all, who's paying for it? If the corporation can't sell its product, the government doesn't get the sales tax, and if the money used to buy things comes from the UBI program, then the corporations and the government are just recycling the existing money without creating any value. Second is who sets how much the UBI is, and how we define ownership.
youtube AI Moral Status 2026-03-14T08:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxDH4I00pEQiTqNVwl4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgysgaxRySe2664aTqt4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugx0mDHNZpWtCLRtK3J4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugwc9AETtnp2NcGjvOB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy7JY5EJu6WYxEEOBR4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyCp8wX_If0rdf1fHh4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxPTjaV6HbctVYoXWt4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxM3oVjRU4ofFEXNAB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxZEJIulBXsEi35Owt4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
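Inspecting the raw output by eye is error-prone once the batch is large, so a minimal sketch of how the response could be parsed and checked is shown below. It assumes the record format above: a JSON array of objects, each carrying a comment `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). The `parse_codings` helper and the truncated two-entry sample are illustrative, not part of the coding tool itself.

```python
import json

# Sample raw model output in the format of the record above
# (truncated to two of the ten entries for brevity).
raw = '''[
  {"id": "ytc_UgxDH4I00pEQiTqNVwl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxZEJIulBXsEi35Owt4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# The four coding dimensions present in this record.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codings(text):
    """Parse a raw LLM response and index the codings by comment id,
    raising if any entry is missing a dimension."""
    out = {}
    for row in json.loads(text):
        missing = [d for d in DIMENSIONS if d not in row]
        if missing:
            raise ValueError(f"{row.get('id')}: missing {missing}")
        out[row["id"]] = {d: row[d] for d in DIMENSIONS}
    return out

codings = parse_codings(raw)
print(codings["ytc_UgxZEJIulBXsEi35Owt4AaABAg"]["emotion"])  # indifference
```

A lookup like the one on the last line is how a per-comment coding table (such as the Dimension/Value block above) could be cross-checked against the raw response it was extracted from.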