Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
It was always going to happen, but it’s hitting the tech world harder than most …
ytc_UgwbpIHU_…
No.
A.I. is a human creation.
Therefore, nothing mankind creates can ever fully …
ytc_UgwTsrNRT…
The way i see it more automation is better. But like always is not a tool proble…
ytc_UgzKrc3_M…
Just as Will Smith stated in "I-Robot"
"Lights and clockwork"
Life is special,…
ytc_Ugz-SybTU…
I enjoyed your commentary. Everything you presented is sound and relatable. Th…
ytc_UgztOKafk…
LLM should never have been named AI. It is the opposite, it is AU for unintelli…
ytc_Ugw7Ps1_r…
Not me writing a book without using AI, with ADHD who still supports the use of …
ytc_UgxsGH1FK…
It is always cheaper to buy a cheap car. While most people actually want to buy …
ytc_Ugy_6aFmU…
Comment
Thanks for the feedback, I edited the OP to try to address some of your criticisms, I'll do it again here just to be more explicit.
1: Any given person must have greater, less, or equal moral value to myself.
I define morality as applying to any actions/thoughts/etc which affect the well being of sentient beings. In other words, the very ability of a mind to have and experience desirable or undesirable sensations is what gives it moral value. A rock or wholly unsentient thing like perhaps bacteria has no moral value; as sentience increases, so does moral value. We humans seem to be the most sentient beings that we know of existing but this morality could certainly apply to any form of sentience, such as the more intelligent animals, or in the future to self-aware AI. How sentient non-human things are seems to depend a lot on science, granted.
2/3. It is logically inconsistent (i.e., false) for any person to have greater or less moral value than myself. Therefore, everyone has equal moral value to myself.
As per what I just said above, and also per 10), this isn't actually what I'm claiming. I'm claiming that we should assume by default that other persons which appear to be mentally (by which I mean the capacity to experience desirable and undesirable sensations) about the same as us should also, by default, be considered morally equal to us. For other persons (by which I mean any kind of sentient mind) that are obviously different from us, this is not necessarily true. For example, science would seem to indicate that while dogs are intelligent/sentient, certainly more so than ants, they are probably not as sentient/intelligent as humans are, so actions that place a higher moral value on humans than dogs seem to be justified. I think you could make the same case for obvious sociopaths/psychopaths. Individuals that have no capacity for empathy or who even take pleasure in the pain of others have an obviously different capacity to perceive or desire posit…
reddit
AI Moral Status
1415090450.0 (Unix timestamp ≈ 2014-11-04 UTC)
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | utilitarian |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-25T08:33:43.502452 |
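The "Coding Result" table above is a per-comment view of one coded record. A minimal sketch of how such a record could be rendered into that markdown table (the record shape and the `render_coding_result` helper are assumptions for illustration, not the tool's actual code):

```python
# Hypothetical helper: render one coded record as the markdown table
# shown above. Dimension names follow the table; the dict shape mirrors
# the raw LLM response records below (an assumption).
def render_coding_result(rec: dict, coded_at: str) -> str:
    rows = [
        ("Responsibility", rec["responsibility"]),
        ("Reasoning", rec["reasoning"]),
        ("Policy", rec["policy"]),
        ("Emotion", rec["emotion"]),
        ("Coded at", coded_at),
    ]
    lines = ["| Dimension | Value |", "|---|---|"]
    lines += [f"| {k} | {v} |" for k, v in rows]
    return "\n".join(lines)

rec = {"responsibility": "none", "reasoning": "utilitarian",
       "policy": "none", "emotion": "approval"}
print(render_coding_result(rec, "2026-04-25T08:33:43.502452"))
```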
Raw LLM Response
```json
[
  {"id":"rdc_n8jknk3","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"rdc_n8j76rx","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"rdc_n8jdfel","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"rdc_clrt2bh","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"rdc_clsif6k","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]
```
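The raw response is a JSON array of one record per comment, each with an `id` plus the four coding dimensions. A minimal sketch of parsing and sanity-checking such a batch; the allowed-value sets are inferred only from the values visible on this page (the real codebook presumably defines more), and `parse_coding_response` is an illustrative name, not the tool's API:

```python
import json

# Allowed values per dimension, inferred from the samples above.
# Assumption: the full codebook likely allows additional values.
ALLOWED = {
    "responsibility": {"none"},
    "reasoning": {"unclear", "mixed", "consequentialist", "utilitarian"},
    "policy": {"none"},
    "emotion": {"approval", "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list:
    """Parse a raw LLM batch response; reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = '''[
  {"id":"rdc_n8jknk3","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"rdc_clsif6k","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}
]'''
print(len(parse_coding_response(raw)))  # → 2
```

Validating against a closed value set at ingest time catches the common failure mode where the model invents a label outside the codebook.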