Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
How do you program a robot to respond to the bolt locking back? Maybe it’s counting bullets. But then there’s always the chance of a miscount… for one reason or another. Gonna say AI and warfare don’t mix as there are so many ways to counteract them. This sort of nuclear strategy going on is quite impervious once you have them pointed. I’m not saying anything should happen but the worst that has come of it is a stalemate and eventually cooperation. Why would we exchange a superior technology for an inferior one? “who said anyone was exchanging? I’m sure the rest of the world won’t agree.” This is the idea right? That’s the message? You take humanity out of the equation when it comes to warfare and we play more into intelligence to fight wars and not combat. And by nuclear strategy I do mean this sort of total war to get the nukes like some sorta hot potato. You look at our military its kinda set for a search and destroy, extract mission. They’re fast. They’re immense. They have support behind enemy lines. They’re meant for infiltration or maybe they’re just used that way. But when was the last time we’ve conquered anything? Liberated: sure, but isn’t that sorta an infiltration? Alright I’m done. Have a good day.
youtube AI Harm Incident 2023-12-08T02:0…
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          regulate
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgyzGQU77lM0CQkXpQF4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyGBIRLIZFJr8KDYP14AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyBLn1QeQJZPptwZAN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzODlPL89PTKu3LJt94AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_Ugx-b4BoY1HyXSlKSyZ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugzxa8ZwCIojsAJH-lx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugyzek_crAb8vcOhW6d4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyAS81bFxfQpM2YVVx4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyTVRF9n7uBbt-b0VJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugxhar7KLwoI2Bfd0AN4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"}
]
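The raw response is a JSON array with one record per comment. A minimal sketch of how such a response might be parsed and validated before use (the `parse_codes` helper is hypothetical; the allowed-value sets below are inferred only from the values that appear in this response and the full codebook may include additional categories):

```python
import json

# Allowed values inferred from the response shown above; the actual
# codebook may define more categories than appear here.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "regulate", "none"},
    "emotion": {"fear", "outrage", "mixed", "indifference"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: {dimension: value}}."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        cid = rec["id"]
        codes = {dim: rec[dim] for dim in ALLOWED}
        # Reject values outside the (assumed) codebook instead of
        # silently storing a malformed code.
        for dim, value in codes.items():
            if value not in ALLOWED[dim]:
                raise ValueError(f"{cid}: unexpected {dim}={value!r}")
        out[cid] = codes
    return out

# One record from the response above, as a round-trip check.
raw = (
    '[{"id":"ytc_Ugzxa8ZwCIojsAJH-lx4AaABAg",'
    '"responsibility":"developer","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"fear"}]'
)
codes = parse_codes(raw)
# codes["ytc_Ugzxa8ZwCIojsAJH-lx4AaABAg"]["policy"] == "regulate"
```

Note that the record for `ytc_Ugzxa8ZwCIojsAJH-lx4AaABAg` matches the Coding Result table above (developer / consequentialist / regulate / fear), which is the consistency a parser like this can check mechanically.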