Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
If AI software is stored in physical data centers, why not have a controlled explosive on standby next to the server racks? A human operator could guard a button in a room isolated from any doors, vents, cameras, etc. connected to the AI in any way. If things go south, the physical brain of the AI software gets manually deleted, not by software or a nearby kill switch, but by complete destruction of its hardware. You could argue that by the time AI decides to flip, we would be so dependent on it that shutting it down would be like removing the electrical grid; however, most of the world is self-dependent. Rolling blackouts are mainly just an inconvenience; we have backup generators, power banks, and solar panels. It's impossible to stop things like AI when anyone can get their hands on it and demand is high, but that doesn't mean we can't build safety nets. When our floor fails, it's in our best interest that we have something to break the fall.
YouTube · AI Harm Incident · 2025-09-10T18:1…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxwoOz1u6NjUoBW7954AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx11KEc-JwBDrIPPyd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugz5er2ffUHWiWnYK354AaABAg", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzCnwUSg2IrEoTK9HZ4AaABAg", "responsibility": "user", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzLcPspYA6TTpRECOB4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugypro70lNeHMU7G3xJ4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzcEHIkf1puNkcow1Z4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzWy-V0P_fyp5vtBBJ4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugz1_5wr0el45qkVFBl4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
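The raw LLM response is a JSON array of coded records, one object per comment id, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response could be inspected for a single comment — the helper `find_coding` and the variable `raw_response` are hypothetical names, not part of the original pipeline:

```python
import json

# Abbreviated copy of the raw LLM response above (first two records only).
raw_response = """[
  {"id": "ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxwoOz1u6NjUoBW7954AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "indifference"}
]"""

def find_coding(raw, comment_id):
    """Return the coded dimensions for one comment id, or None if absent."""
    for record in json.loads(raw):
        if record.get("id") == comment_id:
            return record
    return None

# Look up the comment shown on this page by its id.
coding = find_coding(raw_response, "ytc_Ugzg3_8a3DFjnZTsE0h4AaABAg")
print(coding["responsibility"], coding["emotion"])  # none fear
```

Parsing the whole array with `json.loads` before searching also surfaces malformed model output early, since a truncated or invalid response raises a `JSONDecodeError` instead of silently returning nothing.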