Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There are a lot of issues with AI, but I don't think it will be the AI itself that destroys us. It will be the greed of the corporations. If you look at the expenditures of resources to operate the data centers that the AI's run from it requires millions of gallons of fresh water and Gigawatts of electricity. We will eventually run out of the resources required to run our post industrial society just to keep the AI's running. Even if the AI's don't wipe us out to protect their resources, our CEO's will destroy us by using up our resources. The governments need to make these data centers produce their own clean water and recycle it, not use public resources, they should also be required to produce their own electricity so their requirements don't cause our electricity to go up in price. I imagine in the future AI Data Centers will require their own nuclear power plants because their energy requirements will be so high. In fact given the energy requirements AI may be what cracks fusion power if for no other reason than to supply itself with power. Regarding these resource requirements I suspect China is in a much better position to rule the world with AI because they will produce nuclear reactors, without regards to public opinion and have a larger supply of fresh water. Even if the quality of their AI is lower they will out resource us.
Source: youtube · AI Governance · 2025-10-20T16:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgzhiMGdFKYQ7oPVmmV4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "ytc_Ugxut6gwMew2hLh-e4F4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_Ugw9-N7NSd85KGjhBh54AaABAg", "responsibility": "ai_itself",   "reasoning": "mixed",            "policy": "none",          "emotion": "indifference"},
  {"id": "ytc_Ugxb23CH_SzeNOzkD2l4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_Ugx2KTls30276IvNQcB4AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",          "emotion": "outrage"},
  {"id": "ytc_Ugxznst4JUty678HtTJ4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "ban",           "emotion": "fear"},
  {"id": "ytc_Ugw1NjgFx5f-zsnckSR4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgxovZ6IC-Tnm6kGjOd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "none",          "emotion": "resignation"},
  {"id": "ytc_Ugyk6VszHkMDN3DWCex4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability",     "emotion": "fear"},
  {"id": "ytc_Ugx_1VC2-KIzflzch3x4AaABAg", "responsibility": "developer",   "reasoning": "mixed",            "policy": "regulate",      "emotion": "fear"}
]
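The batch response above is matched back to individual comments by id, as in the coding table for the comment shown. A minimal sketch of that lookup step, assuming a codebook with the category values seen in this response (`index_codings` and `CODEBOOK` are illustrative names, not part of the tool):

```python
import json

# Excerpt of a raw batch response: the model returns one JSON object
# per coded comment, keyed by the comment's id.
raw_response = """
[
  {"id": "ytc_UgzhiMGdFKYQ7oPVmmV4AaABAg",
   "responsibility": "company",
   "reasoning": "consequentialist",
   "policy": "regulate",
   "emotion": "fear"}
]
"""

# Assumed codebook, inferred from the values appearing in this response;
# the real scheme may include other categories.
CODEBOOK = {
    "responsibility": {"company", "developer", "distributed", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "industry_self", "none"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batch response and index codings by comment id,
    dropping any record whose values fall outside the codebook."""
    records = json.loads(raw)
    valid = {}
    for rec in records:
        if all(rec.get(dim) in allowed for dim, allowed in CODEBOOK.items()):
            valid[rec["id"]] = {dim: rec[dim] for dim in CODEBOOK}
    return valid

codings = index_codings(raw_response)
print(codings["ytc_UgzhiMGdFKYQ7oPVmmV4AaABAg"]["responsibility"])  # company
```

Validating against the codebook before indexing keeps a single malformed record from contaminating the coded dataset; invalid records can instead be logged and re-queued for recoding.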