Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It's a computer, it can only do whatever it is programmed to do. This sounds exactly like they are using AI to take the blame for their crimes committed by humans deliberately, but are using AI for an escape GOAT. I don't buy into this computers thinking on their own, it is literally impossible for wires and chips to think, it has no conscience, but people who commit these crimes have no conscience either, they are unable to reason in their thinking, but they do believe that telling a lie long enough will eventually be believed, at least by them preaching it anyway. Unable to convince God, they simply attempt to overcome their victims through exhaustion rather than logic. While computers run on logic, they are only as good as the programmer. Still, they can only protect themselves even, to whatever degree they are programmed. Like programming one to shut down when temperatures reach a certain point, or switch to another program when data limits have been exhausted, or ask a question when no data is found in order to proceed. Only as good as the programmer. Programmer crimes, not computer crimes.
YouTube · AI Harm Incident · 2025-07-29T09:0…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  developer
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgxzNP_6sPbF2rqvJip4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgztwTVY1ewYX748Rq54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgwEMsf9IdR1cik0V9R4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzKYVs0sQ42iNQft5d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwV3vHBwkm4gCcjXhN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzsN_dey8H8EPiEJKp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxjE5wF-fIjZKhHx2d4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy-SlwarJtG0KfeyKR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgyFqs3KogUECPoa0S94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw4gNAPK-VcOTAVxcN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
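A raw response like the one above can be parsed and indexed by comment id to recover the coded dimensions for any single comment. The sketch below is a minimal illustration, not tooling from this project; the `ALLOWED` label sets are assumptions inferred only from the values visible in this record and may be incomplete.

```python
import json

# Label sets per dimension. ASSUMPTION: inferred from the values seen
# in this one record; the real codebook may define more labels.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed"},
    "reasoning": {"deontological", "consequentialist", "virtue"},
    "policy": {"ban", "liability", "industry_self", "none"},
    "emotion": {"outrage", "fear", "approval", "mixed",
                "resignation", "indifference"},
}

def index_codes(raw: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and return {comment_id: {dimension: label}}, rejecting any
    label outside the allowed sets."""
    out = {}
    for row in json.loads(raw):
        codes = {k: v for k, v in row.items() if k != "id"}
        for dim, label in codes.items():
            if dim in ALLOWED and label not in ALLOWED[dim]:
                raise ValueError(f"unexpected {dim} label: {label!r}")
        out[row["id"]] = codes
    return out

# Usage: look up the coding for the comment shown on this page.
raw = ('[{"id":"ytc_UgwEMsf9IdR1cik0V9R4AaABAg",'
       '"responsibility":"developer","reasoning":"deontological",'
       '"policy":"liability","emotion":"outrage"}]')
codes = index_codes(raw)
print(codes["ytc_UgwEMsf9IdR1cik0V9R4AaABAg"]["policy"])  # liability
```

Indexing by id is what lets the "Coding Result" table for one comment be cross-checked against the batch response that produced it.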