Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@ProtoAlpha A perfect example of attaching human sentiments to innately inhuman objects. You say humans have an eagerness to deny rights to intelligent life, I say humans are overeager to assign labels to things they don't fully comprehend. Will they ever be sentient in the exact same way that we are? We don't know. Evidence suggests that, yes, it is possible, for everything, except consciousness. We don't even know what our baseline consciousness is, we have no idea how to interact with it at all. Is it a soul? Is it scientific at all? Does it even exist in the way we think it does? No one knows, yet we're already jumping the gun on AI rights. Rights protect living things, like humans and animals. AI are distinctly non-living. If a human dies, that's tragic, they'll never come back, they are forever lost. If an AI "dies", we can literally copy+paste its code into a new framework, and tadah, it's "alive" again. Life and death have no meaning to machines, unless of course we make them feel pain, program them with real human emotions, but that begs the question; why the hell would we do that?
youtube AI Moral Status 2024-09-26T22:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgwhKyNDzzKSWqeO0VB4AaABAg.ALjW8MiXveiAP3kRV5I0Se", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgxMV8zS2MJr75ryjPB4AaABAg.ADPGhjJxqBNAJseeerG9zn", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy-G0wE-OjlkPpGhQh4AaABAg.AAHZhO5ih4_ABYMLFcctKs", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwkLE-cHE2mR9KOj2B4AaABAg.A75jsNKn7poA8rwgiKuofg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgwJgxnQmjW5R_J27Cl4AaABAg.A4JjMqT1iOfA4kVARdyrSs", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytr_UgxSX12oEzSqC8InW_t4AaABAg.A0SOKY5YyH2A0TkqQMzt9e", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgxSX12oEzSqC8InW_t4AaABAg.A0SOKY5YyH2A0ZzxOMRZ3D", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytr_Ugy4KgyLV178QGu-a6l4AaABAg.A-_Q1zVdbZMA3uIC711cd9", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_Ugy4KgyLV178QGu-a6l4AaABAg.A-_Q1zVdbZMAH6SrliMeKy", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugz_BwUREfrB2zZ3DsB4AaABAg.9v5kC2-j-VU9vx-26LYnFc", "responsibility": "government", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
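The raw response above is a JSON array with one record per coded comment, each carrying the comment id plus the four coded dimensions. A minimal Python sketch of how such a response could be parsed and indexed by comment id (the id, function name, and fallback value below are illustrative assumptions, not part of the tool's actual pipeline):

```python
import json

# Example raw coder output: a JSON array of records, one per comment.
# The id "ytr_example1" is a placeholder, not a real record id.
raw = '''[
  {"id": "ytr_example1", "responsibility": "none",
   "reasoning": "deontological", "policy": "none",
   "emotion": "indifference"}
]'''

# The four coded dimensions shown in the table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_codings(raw_text):
    """Parse the coder's JSON array and index records by comment id."""
    records = json.loads(raw_text)
    indexed = {}
    for rec in records:
        # Keep only the expected dimensions; a missing dimension
        # falls back to "unclear" (an assumed default).
        indexed[rec["id"]] = {d: rec.get(d, "unclear") for d in DIMENSIONS}
    return indexed

codings = index_codings(raw)
print(codings["ytr_example1"]["emotion"])  # indifference
```

Indexing by id this way makes it straightforward to join the model's codings back to the original comments, as the "Coding Result" table above does for a single comment.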