Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It would be logical to adapt the tax system so that the amount of labor automated is measured in man-hours. Subtract 10% of this amount and establish it as a tax. Yes, 90% of what an office or factory saved by replacing people with robots and AI. The business will still receive a 10% increase in profits. And 90% will be redistributed as an unconditional basic income. This is good for business, because all these people will have money to buy goods. Without this, deflation would set in. No one would buy what robots produced, because no one has a job, and therefore no wages. (Remember that communism collapsed because they lacked a reliable system for tracking demand and processing information to replace market relations. A planned economy is better than a market economy, where each product is produced for a specific user. Now you vote for production with your money, but for a communist AI, your Amazon rating is enough to adjust production. Please don't confuse this with the USSR.)
Source: YouTube — "Viral AI Reaction", 2025-11-25T00:3… (♥ 1)
Coding Result
Responsibility: government
Reasoning: contractualist
Policy: regulate
Emotion: approval
Coded at: 2026-04-26T23:09:12.988011
Raw LLM Response
[
 {"id":"ytc_Ugyoe7t-w9TwT5noEKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
 {"id":"ytc_Ugw-LNIdU9LJi-ILkrx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_Ugw0rZvTF49xf9fx6fp4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_Ugya9798QF7P-JV3sv94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_UgytAqkMXXZKblG59Cl4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
 {"id":"ytc_UgwE9BYwNiNaWa9RD2x4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugz4Qem09MGw2D0TTph4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgzinAYY4adG4KGj_Nd4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugw8wcdY2OJnVpiLfGN4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"outrage"},
 {"id":"ytc_UgxntYLjzu3CokfdPgx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
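The raw response above is a JSON array with one object per coded comment: an `id` plus the four coding dimensions (`responsibility`, `reasoning`, `policy`, `emotion`). As a minimal sketch, assuming only that shape (this helper is not part of the original coding pipeline), the batch output can be validated and tallied per dimension like so:

```python
import json
from collections import Counter

# The four coding dimensions used in the batch response above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally_codes(raw: str) -> dict:
    """Parse a raw LLM batch response and count values per dimension.

    Hypothetical helper: assumes the response is a JSON array of objects,
    each carrying an "id" and all four coding dimensions.
    """
    records = json.loads(raw)
    counts = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        # Reject malformed records instead of silently skipping them.
        missing = [k for k in ("id", *DIMENSIONS) if k not in rec]
        if missing:
            raise ValueError(f"record {rec.get('id', '?')} missing {missing}")
        for dim in DIMENSIONS:
            counts[dim][rec[dim]] += 1
    return counts

# Example with two records shaped like the real output above.
sample = """[
 {"id":"ytc_a","responsibility":"government","reasoning":"contractualist",
  "policy":"regulate","emotion":"approval"},
 {"id":"ytc_b","responsibility":"ai_itself","reasoning":"deontological",
  "policy":"none","emotion":"outrage"}
]"""

print(tally_codes(sample)["emotion"])  # one "approval", one "outrage"
```

Counting per dimension gives a quick sanity check on a batch (e.g. how often `policy` came back `none`) before the codes are joined back to the comments by `id`.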