Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That scenario is just unimaginative BS. It’s idle speculation based on precepts that themselves needed closer examination. The main one is that the US is the tech leader and that China would be “two months behind”. China is probably already 2 years ahead, it’s just our decades of propaganda and self-belief wont allow us to accept that fact. And - if AI is to be the future - tbh I’d prefer that it came from China, where at least they have some idea of regulation not just rampant profiteering. Maybe Chinese AI might be less out of control. It’s true we are f-ed by AI but not because it decides to go to war. It’s simpler that that. Right now we’re building power and water greedy data centres at break-neck speed during a climate crisis where resources are scarce. The consequences of that will probably do for us before the robots. The collapse will probably come cos we no longer have enough power and water, nor the ability to live without tech that is wrecked as a result. Humanity and AI will die of a lack of knowledge, not more knowledge.
Source: YouTube · AI Governance · 2025-09-08T07:2…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       consequentialist
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgxBy0BqvXo-NO5ujtx4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgwafRQGp_U9oB_WRPB4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgwlMfIiUfJMJ3oXgPx4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugz6zsUcLDLI-Iu8Fpl4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgxbFB02o_4bN_meXeV4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzyb6FqrpO3hfIa1Xl4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzKn4rOdSAP_NomJap4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgyRgt23SjhfgfNr3W94AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwYSbdW5qMKOcMvvQd4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytc_UgxQKAxvyfgPugyZ50x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]