Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or pick one of the random samples below.
- `ytc_Ugy2h6jlS…` — "We use our youth to build the unthinkable, funny how that works as they know you…"
- `ytc_UgzYg0IvY…` — "No jobs, no taxes and no consumption. No taxes or consumption, not money for AI.…"
- `ytr_UgyiDbRRA…` — "We don’t even want a capitalist but if we have to have it we’d at least like one…"
- `ytr_UgxUSC_ah…` — "It is a good use! You are using it to generate ideas, not using it to generate i…"
- `ytr_UgzVVa4Ht…` — "@jefflewis4 - autonomous vehicles will not need anyone watching them. They will …"
- `ytc_UgyDrjoXI…` — "Money will only keep circulating and being useful for the world's industry leade…"
- `ytr_UgyT7jrUo…` — "@dogsuit an AI will never not be a machine - and machines do not have the capabi…"
- `rdc_m6yug8y` — "Whens the "tech Billionaires run a successful business without depending on gove…"
Comment
I don't have much philosophical background, but one thing I'd like to point out is that the moral boundaries we set are absolutely tied to what is practical.
In an ideal world, we would let every living thing retain all its freedoms; they can do whatever they want and would never have to suffer. No person, cow, or bacterium would be killed. The problem is that certain rights that some living things have will hinder the rights of others. Until very recently in human history, we could not survive without eating other animals ^[[source](https://www.amazon.com/Catching-Fire-Cooking-Made-Human/dp/1469298708)]. We also can't help but kill millions of bacteria left and right regardless of the choices we make, since the normal behavior of bacteria (multiply if you can) basically assumes that a good number of them will die. In fact, we can't really compute which of our actions would kill the fewest bacteria without devoting our own lives to this task.
Practicality manifests itself in more subtle ways in our ever-changing morality as well. Modern medicine would have never gotten a start without rather cruel experiments centuries ago. We now have machines that automate dangerous tasks ([defusing bombs](https://en.wikipedia.org/wiki/Bomb_disposal#WWI:_Military_bomb_disposal_units)) or make them much safer ([building tall things](http://www.allposters.com/IMAGES/ISI/I-BC002.jpg)). For all of these kinds of tasks, the original way of doing things is now immoral or unethical simply because there is a much better way to do it.
If we somehow lost all our technology tomorrow, would we sit around and do nothing claiming that doing anything is unsafe? No, we would continue to build bridges in the old, dangerous fashion while we search for ways to make it safer in the future. Similarly, once we find a way to adequately teach biology and medical students anatomy without using real animals, dissecting live frogs will probably become unethical rather than standard practice.
If you
Source: reddit · Thread: AI Moral Status · Timestamp: 1483321053.0 · ♥ 2
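The comment's timestamp is stored as a Unix epoch value. A minimal sketch of converting it to a readable UTC date (the value is copied from the metadata above; the variable name is illustrative):

```python
from datetime import datetime, timezone

# Convert the comment's Unix epoch timestamp (copied from the
# metadata above) to an ISO-8601 UTC datetime.
posted = datetime.fromtimestamp(1483321053.0, tz=timezone.utc)
print(posted.isoformat())  # → 2017-01-02T01:37:33+00:00
```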
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id":"rdc_dbvye2t","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"rdc_dbw10dz","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"rdc_dbvvhl5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"rdc_gn8wmyq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_nvqkft9","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"}
]
```
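A minimal sketch of how a raw response like this could be parsed and indexed to support the by-ID lookup described at the top of the page. The field names are taken from the JSON above; everything else (variable names, the trimmed copy of the payload) is illustrative:

```python
import json

# A trimmed copy of the raw LLM response shown above.
raw_response = """[
  {"id": "rdc_dbvye2t", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_nvqkft9", "responsibility": "none", "reasoning": "none", "policy": "none", "emotion": "outrage"}
]"""

# Parse once, then index the coding rows by comment ID for direct lookup.
rows = json.loads(raw_response)
by_id = {row["id"]: row for row in rows}

print(by_id["rdc_nvqkft9"]["emotion"])  # → outrage
```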