Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI is the new arms race. AI has no allegiances. AI is only a mirror reflecting the imperfection of human beings. Trillions have been invested into AI which could have invested in clean water, end hunger and homelessness for the entire world. This race to AI supremacy is costing the planet carbon dioxide emissions as super data centres are using more and more electricity generated from fossil fuels and no one is objecting to this. This existential threat AI poses is only to people who are buying, using and believing what AI spews out. We are getting so use to letting AI think for us is the real danger since relying on something will shrink our cerebral cortex. In three generations the human brain will have shrunk to such an extent that is the real existential threat. The political system is fabricating external threats to justify the trillions spent on WMD and AI. AI as a weapon will be reality in 3 to 5 years. The more people are for AI the more I am for returning to a natural life style depending on nature vs AI. This reminds me of Frankenstein where the creature didn't do what its owner wanted but what it wanted. It also reminded me of Terminator where the AI system Skynet decided humans are irrelevant and needed to be exterminated. Mankind has always been short sighted in making and giving power to an intelligent machine.
YouTube · AI Governance · 2025-12-06T02:4…
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           regulate
Emotion          outrage
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgwqXQkkWYZGaAelJZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyOyNy4gDWTwAqQW6t4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyFesVlT5XDKHGdSx14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy89kWQT5Yk2cUZ6_94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugxm_cyesVJlMTuOmV54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxhuyGA2TR9dILxETp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyjTow-SWVzcguO8Et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx5nVBu8JWhj3rb0HR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzepNdgbaOuvL9uY_Z4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwA2ILGwZrBAPHc-D14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}
]
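To inspect the coding for a single comment, the raw response above can be parsed programmatically. A minimal Python sketch follows; the helper name `coding_for` is hypothetical (not part of the tool), and the embedded array is shortened to two real entries from the response above.

```python
import json

# Two real entries copied from the raw LLM response shown above,
# shortened here for illustration.
raw_response = '''[
  {"id":"ytc_UgwqXQkkWYZGaAelJZ54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjTow-SWVzcguO8Et4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}
]'''

def coding_for(raw, comment_id):
    """Return the coded dimensions for one comment ID, or None if absent."""
    for entry in json.loads(raw):
        if entry["id"] == comment_id:
            # Drop the ID so only the coded dimensions remain.
            return {k: v for k, v in entry.items() if k != "id"}
    return None

print(coding_for(raw_response, "ytc_UgyjTow-SWVzcguO8Et4AaABAg"))
# {'responsibility': 'company', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'outrage'}
```

Looking up `ytc_UgyjTow-SWVzcguO8Et4AaABAg` returns the same dimension values shown in the Coding Result table above (regulate/outrage); an unknown ID returns `None`.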