Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is at least one flaw in the reasoning. If greed is the driver of the development of AI, the producers have a very big problem. With 99% unemployment, nobody has the income to buy any products, and businesses will go bankrupt. So you would have to consider taxing the businesses so we can give people a reasonable income that ensures their consumption, and not just food. So this kind of effect would mean our government needs to wake the f***k up. But they will not. They are not even talking about it. They are talking about a non-issue in our elections here in the Netherlands right now, ‘immigration’ (and not the work-related kind, of jobs we don’t want to do anymore, but, well, of people that don’t look like most of us here, people of color). So society will collapse? I am afraid that is a very real possibility. But, to pick up my previous argument, why would the businesses and governments that drive AI push it when it destroys their business model? I just don’t understand. I guess just because they can?
youtube AI Governance 2025-10-13T18:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          unclear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgyEeJtxCZ0L-TXIde14AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwT09jnHKTG6Gu3PYB4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugylb8WW0PXHFyEdnV14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxLG0pOr9_Y5z3WK-R4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzjMOlO3zDnO19WwIN4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "liability", "emotion": "unclear"},
  {"id": "ytc_UgwAf64yfuY-eHjoLYR4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwkg_gMEmym_iBYA4x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxmDj5bf7UKvuDTfQd4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyS2l5Z55euxgX6HzZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw4__W3jlcNOqDpsKl4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
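As a minimal sketch of how a raw response like this can be parsed and sanity-checked: the snippet below loads the JSON array, indexes codings by comment id, and validates labels against an assumed codebook. The allowed label sets are inferred only from the labels visible on this page, not from the project's actual codebook, so they are an assumption.

```python
import json

# A truncated sample of the raw LLM response above (two of the ten entries).
raw = """[
  {"id": "ytc_UgzjMOlO3zDnO19WwIN4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "unclear"},
  {"id": "ytc_UgwT09jnHKTG6Gu3PYB4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Assumed codebook: label sets inferred from this page, not the full scheme.
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "distributed", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "approval", "mixed", "unclear"},
}

def validate(codings):
    """Return (id, dimension, label) triples whose label is outside the codebook."""
    return [
        (row.get("id"), dim, row.get(dim))
        for row in codings
        for dim, allowed in ALLOWED.items()
        if row.get(dim) not in allowed
    ]

codings = json.loads(raw)
by_id = {row["id"]: row for row in codings}

# Look up the coding for the comment shown on this page.
print(by_id["ytc_UgzjMOlO3zDnO19WwIN4AaABAg"]["policy"])  # liability
print(validate(codings))  # [] when every label is in the codebook
```

In practice the same validation pass would also catch malformed model output (missing dimensions, invented labels) before the codings are stored.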