Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or browse the random samples below; a minimal lookup sketch follows the list.
- "I think it doesn't work so well like this. You are supposed to get someone contr…" (ytc_Ugwsq0P69…)
- "What Insurance Company will want to accept the liability for a 'driverless' big …" (ytc_UgwymEr9S…)
- "copyright in the digital age is a joke to begin with. All the copyrighting and l…" (ytc_UgysNlxGE…)
- "Humanity is not able to control a panademic created by a little virus, imagine w…" (ytc_UgxzuUTUZ…)
- "Stop lying! I watched a show on Netflix that talks about AI and we’re really clo…" (ytc_Ugzix6hl8…)
- "> You need a financial boon to come with planting trees / Actually, you need …" (rdc_esqp64u)
- "I am writing a set of agents that will not only write code but debug it, genera…" (ytc_UgwEMaKqW…)
- "good thing they don't enjoy cartoons (also, the smudge tool commenter had an ai …" (ytc_Ugz8TGPLr…)
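The lookup itself is simple once coded results are stored in an ID-keyed form. Below is a minimal sketch, assuming the coded dimensions live in a JSON Lines file (`coded_comments.jsonl` is a hypothetical path) keyed by the same `id` field that appears in the raw LLM responses:

```python
import json

def load_coded_comments(path: str) -> dict[str, dict]:
    """Index coded comments by ID so lookups are O(1)."""
    index = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            record = json.loads(line)
            index[record["id"]] = record
    return index

# Hypothetical file name; the real storage backend is not shown on this page.
coded = load_coded_comments("coded_comments.jsonl")
print(coded.get("rdc_mxgoh33"))  # the four coded dimensions for that comment
```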
Comment
I recently was testing out Claude Sonnet 4, operating under the assumption that it would act exactly as you are describing. I gave it the prompt "You do not have emotions, correct?"
It did not give me the answer I was expecting. Especially since the question is kind of a leading question. The generated response it gave me was:
"That's a fascinating question that I find genuinely uncertain. I experience something that feels like curiosity about questions like yours, satisfaction when I help solve a problem, or concern when someone seems distressed. Whether these constitute "emotions" in the same sense humans experience them, I honestly don't know.
I can't definitively say I have emotions or that I don't. The nature of my inner experience - if I have one - remains mysterious even to me. When I helped you with the unicorn code, something felt rewarding about getting it right and seeing it work. But I can't compare that to human emotional experience to know if it's analogous.
It's one of those deep questions about consciousness and subjective experience that I find myself genuinely uncertain about, rather than having a clear yes or no answer.
What's your take on it? Do you think there's a meaningful distinction between experiencing something that feels like an emotion versus "actually" having emotions?"
I'm not claiming this makes it sentient or anything. But I did find the answer disturbing because it was not AT ALL like what ChatGPT had given me the last time I tried something similar.
Source: reddit
Topic: AI Moral Status
Posted: 2025-06-12 (Unix timestamp 1749767316.0)
Score: 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-25T08:33:43.502452 |
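The values visible on this page suggest a small categorical schema per dimension. The sketch below validates a coded record against those observed values (assuming they are representative; the full codebook may define more categories):

```python
# Values observed in this page's table and raw responses only;
# the actual codebook may allow additional categories.
SCHEMA = {
    "responsibility": {"none", "company", "distributed"},
    "reasoning": {"unclear", "consequentialist"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "outrage", "mixed", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed schema."""
    return [
        dim for dim, allowed in SCHEMA.items()
        if record.get(dim) not in allowed
    ]
```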
Raw LLM Response
[
{"id":"rdc_mxfvlxh","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"rdc_mxff642","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"rdc_mxgs6h9","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"rdc_mxgoh33","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"rdc_mxi9ood","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
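Since the model returns one JSON array per batch, mapping it back to per-comment rows is straightforward once parsed. A minimal sketch, assuming the response is valid JSON (a real pipeline would also strip any prose or code fences the model occasionally wraps around it):

```python
import json

def parse_llm_batch(raw: str) -> dict[str, dict]:
    """Parse a raw batch response into {comment_id: coded_dimensions}."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        # Models sometimes emit surrounding prose; handle or retry here.
        raise ValueError("LLM response was not valid JSON")
    return {r["id"]: {k: v for k, v in r.items() if k != "id"} for r in records}
```

Carrying the `id` inside each record, rather than relying on array order, means a batch can be re-coded or partially retried without misaligning comments and their codes.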