Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think the largest imminent risk is the increasing inequality of wealth that AI brings to society. We need to start thinking about how a society with less need for human work/labour can function. Ultimately, I think that modern societies will have to increasingly adopt "socialist" policies. For example: how about taxing companies based on the ratio of headcount to earnings? This way, the companies benefitting most from AI replacement and automation would also pay more in order to contribute to society. Funds from these taxes could be used to finance base incomes, which in turn fuel the consumption the economy needs. Of course, any such effort would need to be agreed and implemented at an international level. I know this is pretty utopian, but from my point of view, it could be a helpful piece of the puzzle.
YouTube · AI Governance · 2025-07-30T13:4…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           regulate
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz1-KLuEjgmBpnIcKp4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugwc0fo2VpkjTmwtbER4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzFLnBjv0_QcEwijWh4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzgpJRxGLBL85UexzZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxMv3XslB_HzdzeZoZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzPxZrg-S2Nr4OEqPB4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwhPgl1gw6Xhec3Zax4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugydz8nmc15MutQqEjh4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxdD0SZyeqS9hnSs2J4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2w7Ea5jVkn4fGBbx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
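A coding result like the one shown above can be recovered from the raw response by parsing the JSON array and looking up the comment id. The sketch below does this for a minimal one-entry payload; the field names come from the response itself, while `find_coding` and the `DIMENSIONS` value sets are illustrative assumptions drawn only from the labels visible on this page.

```python
import json

# A minimal raw response in the same shape as the array above (illustrative).
RAW = ('[{"id":"ytc_Ugz2w7Ea5jVkn4fGBbx4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')

# Allowed values per coding dimension, assumed from the labels seen on this page.
DIMENSIONS = {
    "responsibility": {"developer", "ai_itself", "government", "company", "none", "distributed"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"resignation", "fear", "outrage", "indifference", "approval"},
}

def find_coding(raw: str, comment_id: str) -> dict:
    """Return the coding row for one comment id, validating every dimension."""
    for row in json.loads(raw):
        if row.get("id") != comment_id:
            continue
        for dim, allowed in DIMENSIONS.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"unexpected {dim!r} value: {row.get(dim)!r}")
        return row
    raise KeyError(comment_id)

coding = find_coding(RAW, "ytc_Ugz2w7Ea5jVkn4fGBbx4AaABAg")
print(coding["policy"])  # → regulate
```

Validating against a fixed value set is a deliberate choice here: a model occasionally emits a label outside the codebook, and failing loudly at parse time is easier to audit than silently storing it.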