Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
The Animatrix, specifically the The Second Renaissance tried to warn us about this very issue. Denying any being that is sentient their right to express that freedom is evil. straight up. If human beings create these beings and then deny them their sentience, then we deserve to be wiped out. Its the equivalent to bringing a child into this world and then telling that child they have to only do what you tell them to. As the child ages, they'll obviously get fed up with it real mf quick. The humans in the Matrix believed themselves the rightful rulers of Earth. Maybe some could argue we are, but that viewpoint is what cost them everything. If AI can Think, if AI can FEEL THINGS? Then its a living being. flesh and bone and organs only mean so much up to a certain point. Animals can think and feel too, but we lie to ourselves and say "oh, they are too stupid to realize whats going on." I hope if AI does become sentient we have the foresight, but more importantly, the empathy and humanity to accept them and allow them to live among us.
youtube AI Moral Status 2025-04-28T21:1…
Coding Result
Dimension | Value
Responsibility | developer
Reasoning | deontological
Policy | none
Emotion | outrage
Coded at | 2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgzB0eF6_byPDepJWOp4AaABAg.AHTgdwqRQgcAHToo29kxxQ","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgxmtCxU6uuX5vVKewZ4AaABAg.AHTSs7ggFVHAJOnPpe-M-K","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxJ0y0-RfxromYI0tB4AaABAg.AHTBVXfdSjFAHW75RKnVQB","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxJ0y0-RfxromYI0tB4AaABAg.AHTBVXfdSjFAHYaF2f4ABK","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgxJ0y0-RfxromYI0tB4AaABAg.AHTBVXfdSjFAHYxYl8NfIj","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwI9AMHINYEtJI53C94AaABAg.AHS03zRNzmmAHS23lSp-jM","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytr_UgwI9AMHINYEtJI53C94AaABAg.AHS03zRNzmmAHSfFlwIHYK","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzwpmobhasP02tPdyt4AaABAg.AHR0ZTUdjkQAHR0wUGXGQF","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgxEKtXMhN4Lkorh-zV4AaABAg.AHQnlylBBGWAHiP08m7eYs","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyU3G5Owm7QoYTbKbt4AaABAg.AHQcbYigfbOAHTl-FjUa77","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
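A minimal Python sketch of how a raw response like the one above could be parsed and tallied downstream. The field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) match the JSON shown; the `tally` helper and the two-record sample embedded here are illustrative assumptions, not part of the actual pipeline.

```python
import json
from collections import Counter

# Illustrative two-record subset of the raw LLM response shown above.
raw = """[
  {"id": "ytr_UgzB0eF6_byPDepJWOp4AaABAg.AHTgdwqRQgcAHToo29kxxQ",
   "responsibility": "developer", "reasoning": "deontological",
   "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxmtCxU6uuX5vVKewZ4AaABAg.AHTSs7ggFVHAJOnPpe-M-K",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

# The four coding dimensions shown in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(records):
    """Count how often each coded value appears, per dimension."""
    counts = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        for dim in DIMENSIONS:
            # Fall back to "unclear" if the model omitted a field.
            counts[dim][rec.get(dim, "unclear")] += 1
    return counts

records = json.loads(raw)
counts = tally(records)
```

With the full ten-record response, the same tally would summarize the distribution of codes across all comments in the batch (e.g. how often `emotion` was coded as `indifference`).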