Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
These people have no idea what they are saying. AI is here. It’s replacing high …
ytc_UgyC3xQRa…
If He would tell robot I will destroy u... what would Robot's recation had been?…
ytc_UgzStArzj…
That's crazy. I remember when I was a kid and future predictions would always us…
rdc_euezw01
Unfortunately only enough, not most, need to choose AI. And so far they have! Th…
ytr_Ugxhp-Dhw…
Entering High school with this huge boom in Ai tech makes life so much easier. T…
ytc_Ugww4hrfz…
Its simply shifting the jobs, its a common occurrence, yes some artists will pro…
ytc_Ugxy5SQNO…
Sam Altman
Too much for someone who call raising a kid is more energy consumpti…
ytc_UgzU43D1a…
I don't comment on YouTube videos but I needed to here. I spend every day of my …
ytc_UgyEFQOca…
Comment
Sigh, another commenter who assumes consciousness is defined solely by behavior.
> Okay. What behavior does an LLM need to show so that you would admit that it has the capacity to feel, want, or empathize?
Is emotion just a behavior? When you experience emotion, is behavior the only result? Clearly not.
> If you don't assign the ability to feel, want, or empathize on behavior that someone or something shows, what do you base it on?
Just because you cannot think of a satisfying qualifier of emotions/feelings/etc. beyond the resulting behavior, doesn't mean there isn't one.
> Human memories are just weights in neuronal connections, and not "snapshots of experience". But fine.
That is as reductive as "LLMs are just a matrix of numbers" or "computers are just 1s and 0s". All of these statements, including yours, are so reductive that they are essentially meaningless.
> When weights in a neuronal network are "snapshots of experience", then any LLM, whose whole behavior is encoded by learned weights in neural networks, is completely built from memories which are snapshots of experiences. Wait, the weights in a human neural network which let us recall things, count as "snapshots of experiences", while the weights in a neuronal network of an LLM, which enables it to recall things, do not count? Why?
In addition to your entire premise being overly reductive, this is a complete misunderstanding of how LLMs work. **Weights in an NN are never used to "recall", a NN does not function as a memory system.** The closest thing weights in an NN resemble is intuition - they directly control the likelihood of certain tokens emerging in the pattern of the output, given the pattern of tokens in the input. **You are idiotically comparing this to the encoding of memories in humans!**
> And you write about your consciousness because it's real? How is your consciousness real? Show it to me in anything that isn't behavior. Show me your capacity to feel, want, or empat
reddit
AI Moral Status
2025-02-19 (Unix timestamp 1739931766)
♥ 3
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | outrage |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
[
  {"id": "rdc_mdjclr9", "responsibility": "none",    "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mdjn4xp", "responsibility": "none",    "reasoning": "mixed", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_mdjiq5l", "responsibility": "none",    "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mdilzp3", "responsibility": "company", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_mdio93r", "responsibility": "none",    "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"}
]
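A response in this shape can be parsed and sanity-checked before the codes are stored. The sketch below is a hypothetical helper, not part of the pipeline shown here; the allowed value sets are inferred only from the codes visible on this page, so the real codebook almost certainly contains more categories.

```python
import json

# Allowed values per coding dimension, inferred from codes visible in
# this dashboard (assumption: the actual codebook is larger).
ALLOWED = {
    "responsibility": {"none", "company"},
    "reasoning": {"mixed"},
    "policy": {"unclear"},
    "emotion": {"indifference", "outrage", "fear"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse one raw LLM response and reject records with unknown codes."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim!r}: {rec.get(dim)!r}")
    return records

# One record taken verbatim from the raw response above.
raw = ('[{"id":"rdc_mdjclr9","responsibility":"none","reasoning":"mixed",'
       '"policy":"unclear","emotion":"indifference"}]')
print(len(parse_codes(raw)))  # 1
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents a code outside the codebook.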