Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great conversation. The precedent is that things that have no smarts have historically caused more problems than we can ever imagine in the second order. Even here, the conversation about taking over is short sighted in just how easy AI can "take over" or disrupt with or without the help of humans. Disrupt the flow of energy, fuel, food or money and people will turn against each other in less than a few weeks to some extent. Have you seen the panic in supermarkets when toilet roll becomes scarce! Imagine global shipping and distribution, payments systems, banks, power to homes is bork'd as it totally relies on digital connectivity. Imagine a new covid every month! A subtle amount of manipulation or disruption has huge consequences that we aren't talking about. Both people-driven nefarious actions and later agentic collaboration will have immense second order effects we cannot even contemplate. The cat is out of the bag already. Extreme caution is essential - Look at the prep we put globally into the "millenium Bug" and in comparison what remediation and protection are we putting in place in all major primary essential industries. AI protection consultancies will be a huge boom for all industries. Sadly the AI can already preempt every countermeasure.
youtube AI Governance 2025-06-16T22:0…
Coding Result
Responsibility: distributed
Reasoning: consequentialist
Policy: unclear
Emotion: fear
Coded at: 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxgGa-GwbHM7UPWOGF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyj9M4wI6L-f9bvyxN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJ_-EzAEaCTRTyaUB4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgymND7Ep_kVEFF_iIV4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzqtvxRoYtL85JOsB14AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyUK0ANWNww_ShCwdJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwet04hm66O3mMvXoJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugyo5CJpdcYCrgZ5_NZ4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyrv-4VQDp3lXxuTml4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgwtgZOkac5hB2payBN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"fear"}
]
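A minimal sketch of how a raw response like the one above could be parsed and sanity-checked. This is not the project's actual pipeline code: the `parse_codes` helper and the `ALLOWED` value sets are hypothetical, inferred only from the labels visible in this response, so the real coding schema may permit additional values.

```python
import json

# Assumed value sets per dimension, inferred from the labels seen in the
# raw response above; the real schema may allow more values.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "unclear"},
    "emotion": {"fear", "mixed", "indifference", "approval", "resignation", "outrage"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse the raw JSON array and reject any out-of-schema labels."""
    rows = json.loads(raw)
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return rows

# Single-entry example matching the coded comment shown above.
raw = ('[{"id":"ytc_Ugwet04hm66O3mMvXoJ4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"unclear","emotion":"fear"}]')
codes = parse_codes(raw)
print(codes[0]["emotion"])  # fear
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad rows fail loudly instead of silently entering the dataset.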