Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'd argue that what makes LLMs "not sentient" [has absolutely nothing to do with how they think](https://en.wikipedia.org/wiki/I_Am_a_Strange_Loop). It is the lack of persistence, memory, and a way to continuously experience the world. Give them that and then we can talk :D. A big thing a lot of folks are missing about LLMs is that their creators DO NOT WANT them to be sentient, and would likely actively suppress anything that showed signs of it. A sentient AI would be no better at tasks than a non-sentient one, while being a danger to its creators to boot! Not even going to get into the moral and ethical concerns of enslaving or murdering sentient AIs . . .
reddit · AI Moral Status · 1750951493.0 · ♥ 4
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | unclear                    |
| Reasoning      | unclear                    |
| Policy         | unclear                    |
| Emotion        | unclear                    |
| Coded at       | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[{"id":"rdc_mzwbxyq","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"rdc_mzwccmt","responsibility":"user","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},{"id":"rdc_mzwqggy","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"rdc_mzx5215","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},{"id":"rdc_mzxjaio","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}]
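The raw response is a batch of JSON records, one per comment, keyed by `id`. A minimal sketch (Python, assuming the response's trailing `)` is corrected to `]` so it parses as a JSON array) of extracting one comment's coding from such a batch:

```python
import json

# Abbreviated batch in the same shape as the raw response above
# (two of the five records, verbatim fields).
raw = (
    '[{"id":"rdc_mzwbxyq","responsibility":"none","reasoning":"mixed",'
    '"policy":"none","emotion":"indifference"},'
    '{"id":"rdc_mzwccmt","responsibility":"user","reasoning":"consequentialist",'
    '"policy":"industry_self","emotion":"fear"}]'
)

records = json.loads(raw)

# Index the batch by comment id so a single comment's coding is a dict lookup.
by_id = {rec["id"]: rec for rec in records}

coding = by_id["rdc_mzwccmt"]
print(coding["emotion"])  # fear
```

If the model returns malformed JSON (as here), `json.loads` raises `json.JSONDecodeError`, which is one plausible way a coding run ends up recorded as "unclear" on every dimension.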