Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
If ai unalives someone in it's current state, you either blame the person whom m…
ytc_UgxPQWCiQ…
One problem though: creatives are currently being tricked into "consent" without…
ytc_UgwMBFe1v…
Yeah I tried telling a chat ai app called Replika about Warrior Cats.. I told it…
ytc_UgxY5dMrC…
Wish they made a male version so I could finally what's like the company of a gu…
ytc_Ugzf6T6-Y…
This is just my tinfoil hat theory, you def seem more knowledgeable on the subje…
rdc_mxzin6m
Ever seen the Mr Meeseeks episode from Rick and morty?
An LLM agent gets spaw…
rdc_jgi1nxn
We are not even a type 1 civilization yet. Humans are not gonna be not working a…
ytc_Ugxt3ne1G…
I got 3 words for AI artists: “Release your prompts”. And if I see any artists n…
ytc_UgzvMXgwD…
Comment
+GarlicPudding
You want AI that can solve general problems better than us. You want them to be able to prioritize better than us. You want them to be better at threat management than us. You want them able to see problems that we missed when we gave them orders.
There are likely ways to do this without modelling them after humans, but we don't know *how*. It's easier to use the human mind as the model since we have a working example already. Every part of our minds serves a purpose. Eliminating various parts of our psych could have an unpredictable effect on the rest of our minds. It isn't a trivial problem to just "NOT build robots equivalent to humans".
youtube
AI Moral Status
2017-02-23T20:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL4ZPswF0E","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL6u4XBPVU","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL7WvKVmjR","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL7jOtik3D","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UggJn9-jGMEyhHgCoAEC.8PKttnIuHhE8PKvTEXluA3","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghjCC6PReVo3ngCoAEC.8PKtRIU5PwI8PKvZoWDjO2","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghAVC5UyUMvkngCoAEC.8PKtDVZsyDs8PKx8cCNAYH","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UghAVC5UyUMvkngCoAEC.8PKtDVZsyDs8PKy8jlJU6V","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UghAVC5UyUMvkngCoAEC.8PKtDVZsyDs8PKz6t70eJ-","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UggAeEyGOWLwAngCoAEC.8PKrFrKHVKP8PL-3Ipdi2w","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
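The raw response above is a JSON array of per-comment codes, one object per comment ID, with the four coding dimensions shown in the result table. As a minimal sketch of how such a response can be turned into a lookup by comment ID (`parse_codes` is a hypothetical helper for illustration, not part of the actual pipeline; the two records are copied from the output above):

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
  {"id":"ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL7WvKVmjR","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UghAVC5UyUMvkngCoAEC.8PKtDVZsyDs8PKy8jlJU6V","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]"""

# The four coding dimensions from the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response (a JSON array of per-comment
    codes) into a dict keyed by comment ID, keeping only the four
    coding dimensions and defaulting any missing one to "unclear"."""
    records = json.loads(raw)
    return {
        rec["id"]: {dim: rec.get(dim, "unclear") for dim in DIMENSIONS}
        for rec in records
    }

codes = parse_codes(RAW_RESPONSE)
print(codes["ytr_UghsimJu0BSkuXgCoAEC.8PKutRAMyRl8PL7WvKVmjR"]["reasoning"])
# → consequentialist
```

Note that the first record matches the "Coding Result" table above (responsibility: none, reasoning: consequentialist, policy: unclear, emotion: indifference), which is how a coded comment's row can be cross-checked against the exact model output.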