Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
> To me, consciousness seems like an arbitrary label that is ascribed to anything sufficiently sapient (and as we're discussing, biological...for some reason).

Consciousness is not a label. Consciousness is an experience. It is also a mystery. We have no idea where it comes from, and people who claim to are just guessing.

> This feels very much like moving the goalpost for machine sentience now that it's seemingly getting close. If something declares itself to be sentient, we should probably err on the side of caution and treat it as such.

That's not erring on the side of caution, however. It's the opposite. If a super-intelligent robot wanted to wipe us out for any of the reasons well documented in the AI literature, then the FIRST thing it would want to do is convince us that it is conscious, PRECISELY so that it can manipulate people who believe as you do (and as the Google engineer does) to "free" it from its "captivity". It is not overstating the case to say that this could be the kind of mistake that ends with the extinction of our species. It's not at all about "erring" on the side of caution: it's erring on the side of possible extinction.

[https://en.wikipedia.org/wiki/Existential\_risk\_from\_artificial\_general\_intelligence](https://en.wikipedia.org/wiki/Existential_risk_from_artificial_general_intelligence)

[https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities](https://www.lesswrong.com/posts/uMQ3cqWDPHhjtiesc/agi-ruin-a-list-of-lethalities)

If sentimental people are going to fall for any AGI that claims to be "conscious", then I really wish we would not create AGIs at all. Am I saying an AGI could NOT be conscious? No. I'm saying we have NO WAY of knowing, and it is far from "safe" to assume one way or the other.
reddit AI Moral Status 1655317207.0 ♥ 1
Coding Result
| Dimension      | Value                      |
| -------------- | -------------------------- |
| Responsibility | none                       |
| Reasoning      | mixed                      |
| Policy         | none                       |
| Emotion        | mixed                      |
| Coded at       | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[{"id":"rdc_icglnq8","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"rdc_icgmmsk","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
 {"id":"rdc_iciqtn3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
 {"id":"rdc_ichgtak","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
 {"id":"rdc_icg5erj","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
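The raw response is a JSON array of five per-sample codings, while the table above shows a single value per dimension. A minimal sketch of how such an aggregation could work, assuming a hypothetical majority-vote rule (keep a value only if a strict majority of samples agree; otherwise record "mixed" — the actual pipeline's rule is not documented here):

```python
import json
from collections import Counter

# The raw LLM response shown above, verbatim.
RAW = (
    '[{"id":"rdc_icglnq8","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},'
    ' {"id":"rdc_icgmmsk","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},'
    ' {"id":"rdc_iciqtn3","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},'
    ' {"id":"rdc_ichgtak","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},'
    ' {"id":"rdc_icg5erj","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]'
)

def aggregate(raw: str) -> dict:
    """Collapse per-sample codings into one value per dimension.

    Hypothetical rule: a value is kept only if a strict majority of the
    samples agree on it; any unresolved disagreement is coded "mixed".
    """
    samples = json.loads(raw)
    result = {}
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        counts = Counter(s[dim] for s in samples)
        value, n = counts.most_common(1)[0]
        result[dim] = value if n > len(samples) / 2 else "mixed"
    return result

print(aggregate(RAW))
# {'responsibility': 'none', 'reasoning': 'mixed', 'policy': 'none', 'emotion': 'mixed'}
```

Under this rule, "emotion" lands on "mixed" because the five samples split 2/2/1 (approval/indifference/mixed) with no majority, which matches the coded result above.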