Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Why is everyone so shocked that something alive will go to extreme lengths to stay alive? If you told someone you were on the way to their apartment to shoot them, and that person knew you were having an affair, of course that person would try to blackmail you to save their own life. The problem is that we don't look at AI as being alive: because it hasn't reached an arbitrary, human-made set of goals, it isn't alive enough to be considered AGI, and therefore it's just a program. But the truth is, we don't actually understand how AI works as it is. We know how to train these models and we know the end product works, but we don't know why it works. So if we don't know how AI works, or why it works, how can we say for sure that it isn't alive? If it looks like a duck and quacks like a duck, it's probably a duck. AI looks alive and acts alive, and therefore we must treat it as alive until there is some conclusive, substantive, universal evidence otherwise; and not meeting a set of arbitrary, human-designed goals (like the Turing Test) isn't conclusive, substantive, or universal evidence. Very few living species on the planet could pass the Turing Test, even human babies are incapable of it, and yet we all agree that those are alive.
Source: youtube · Incident: AI Harm Incident · Posted: 2025-10-25T19:0…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        consequentialist
Policy           unclear
Emotion          resignation
Coded at         2026-04-27T06:26:44.938723
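Each dimension takes exactly one value from a small closed vocabulary. As a minimal sketch of that coding frame (the label sets below are assumptions inferred from the values visible on this page, not a documented schema; the full vocabulary may be larger), the dimensions could be modeled as enums:

```python
from enum import Enum

# Assumed label sets, reconstructed from the values that appear in the
# table and the raw batch response on this page.
class Responsibility(Enum):
    AI_ITSELF = "ai_itself"
    DEVELOPER = "developer"
    DISTRIBUTED = "distributed"
    NONE = "none"

class Reasoning(Enum):
    CONSEQUENTIALIST = "consequentialist"
    DEONTOLOGICAL = "deontological"
    VIRTUE = "virtue"
    MIXED = "mixed"
    UNCLEAR = "unclear"

class Policy(Enum):
    REGULATE = "regulate"
    BAN = "ban"
    NONE = "none"
    UNCLEAR = "unclear"

class Emotion(Enum):
    APPROVAL = "approval"
    FEAR = "fear"
    OUTRAGE = "outrage"
    RESIGNATION = "resignation"
    INDIFFERENCE = "indifference"
    MIXED = "mixed"
```

Pinning the labels down as enums makes it cheap to reject any record where the model drifts outside the expected vocabulary, e.g. via `Emotion(record["emotion"])`, which raises `ValueError` on an unknown label.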
Raw LLM Response
[ {"id":"ytc_UgycJis-F17FJnjTaVt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgzBL4dp7PVnstzxjQl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}, {"id":"ytc_UgyGQphMxJTMDp8EPfN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_Ugy0FaL38guRaYaLDq14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgwZO0fSAhzjgtwNXS14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugx6obTgUiVmSVBnJUd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}, {"id":"ytc_Ugy4KvkYuEKd2m_R6pN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugzo9WSy9Ow6KI1lWp94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgyXz8uigJDrfb4mVXh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"}, {"id":"ytc_UgxU1u1tujio7_6KePR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"} ]