Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Lots of scariness which mainly comes down to this idea of a race for AGI/ASI in order to gain dominance over everyone. I'm not bought into this idea that whoever gets this first becomes the dictator of the world and then the AGI/ASI itself wipes out humanity because it sees us as a threat or just like ants that don't matter in their schemes to complete their objectives. We have no evidence that AI "thinks" in a way that suggests "intelligence" or self-awareness. They simply run programs, and those programs do an amazing job of emulating conscious, thinking entities. But they aren't, at least as far as we know. There's this sense that an ASI will simply demolish us, but it has to be given the ability to do so. If we give it the privileges to large infrastructure systems, power grids, nuclear weapons, etc., yes, we're basically asking for extinction. My hope is that no person/leader/government is that dumb to hand over the keys for such critical systems to a system that we can't know exactly what they will do. But yeah, humanity doesn't always make the best decisions, and we just need one leader who ignores all the red flags and hands over such keys...
youtube AI Harm Incident 2025-07-25T12:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugy9XNSUXjNzQn8zObR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgytTT0TCIdfNt2Boet4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwIMyfMe7BUxFpVNoJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyxC04S3pjJKno-is94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwZEPAlypo48kmP1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzZrdF86LwtOZ5-gMJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwkClpuXiq1jogGykx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugy4-g20X8iH-pV_Cm54AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgweBuj-dO_0rL9s2Bx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxVA0R1QZwmyEhM4M94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
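The raw response above is a JSON array of per-comment codes, one object per comment `id`. A minimal sketch of how such a response could be parsed and a single comment's codes looked up (the two sample entries are copied from the response above; the variable names are illustrative, not part of any coding pipeline):

```python
import json

# A two-entry subset of the raw LLM response shown above.
raw = '''[
  {"id":"ytc_UgyxC04S3pjJKno-is94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxwZEPAlypo48kmP1J4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"outrage"}
]'''

# Parse the array and index the coded rows by comment id.
rows = json.loads(raw)
by_id = {row["id"]: row for row in rows}

# Look up the codes assigned to one comment.
codes = by_id["ytc_UgyxC04S3pjJKno-is94AaABAg"]
print(codes["emotion"])  # fear
print(codes["reasoning"])  # consequentialist
```

Because the model may return malformed JSON, a real pipeline would typically wrap `json.loads` in a `try/except json.JSONDecodeError` and re-prompt or flag the batch on failure.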