Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
When the energy infastructure is in place, to actually utilize AI on any sort of…
ytc_UgwyNz3fD…
I asked my ChatGPT (5) to tell me his name. He hade to choose it by himself. Try…
ytc_UgwfEoXF1…
–once a legend sayed the world will be destroyed in 2026.....
And that's I think…
ytc_UgysC_ibt…
Yeah, AI won’t prevent people from creating art, but it will steal from people. …
ytr_UgwmXkOzc…
“ If only one life is saved, it will be worth it.”
Expect that argument.…
rdc_eu6ihns
But, and a huge but! Humans can survive in water without special water proof equ…
ytc_UgxtC-pd1…
Given that AI just follows algorithms, if no developer programmed it to self pro…
ytc_UgwW-SFU5…
Thanks for sharing your thoughts! It's true that the reality of AI is often more…
ytr_UgzIjNmDJ…
Comment
> If the AI is smart enough to not touch fire, it doesn't need pain to prevent it from touching fire
Pain doesn't exist for an AI. Until it learns it, creating pathways in the "fake" neurons that activate when it hits something it doesn't like.
Once those are in place, the AI reacts exactly as a human does with something that is incredibly analogous to what humans have.
And that's the sticker. That's why we fail to recognize this. These aren't human emotions. We shouldn't expect them to be. They will be radically different not only between human and AI, but between different AI.
> And in many situations that would even have its advantages, as a robot might either be disposable or repairable, its survival might be of no importance.
The emotions and self awareness need not be human to be emotions and self awareness. We are built to self preserve. AI are built to an error function that may or may not include self preservation.
Different in outcome does not mean different in function or purpose. We feel pain as a means to an end. These AI do as well, without being coded to, for the same reason we do. Even if the "end" is different.
> Same goes for self awareness, that's important for us due to being body-bound, but an AI does not have such limitations.
You (any system) fundamentally have to be self aware to learn. If you aren't self aware then the way you tweak your own thought process cannot be guided to a productive end.
Source: reddit · Thread: AI Moral Status · Posted: 1663182703 (Unix time, 2022-09-14) · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_iofnf10","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_iodxwab","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"rdc_ioekcis","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"rdc_ioffat0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_it977bk","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
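The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions from the table. A minimal sketch of parsing and validating such a response might look like the following; the allowed value sets here include only values actually seen in this sample (the real codebook presumably defines more), so they are an assumption for illustration:

```python
import json

# Allowed values per dimension, inferred ONLY from the rows shown above;
# the actual codebook is assumed to be a superset of these.
ALLOWED = {
    "responsibility": {"none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "mixed", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response, keeping only rows whose
    values fall inside the known vocabularies."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Example: one well-formed row and one with an out-of-vocabulary code.
raw = '''[
  {"id":"rdc_iofnf10","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_bad0000","responsibility":"none","reasoning":"???","policy":"none","emotion":"indifference"}
]'''
print([r["id"] for r in parse_coding_response(raw)])  # → ['rdc_iofnf10']
```

Filtering rather than raising keeps a single malformed row from discarding the whole batch; rejected rows could instead be queued for recoding.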