Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
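A lookup of this kind can be sketched in a few lines. This is a hypothetical sketch, not this page's actual implementation: it assumes the raw model output is a JSON array in the format shown under "Raw LLM Response" further down, and the two records here are copied from that array.

```python
import json

# Two records copied from the raw LLM response shown on this page.
raw_response = """
[
  {"id": "ytc_Ugju7aEfGYlMBXgCoAEC", "responsibility": "none",
   "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugg2MtUBRNtZ9ngCoAEC", "responsibility": "developer",
   "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"}
]
"""

# Build an id -> record index so any coded comment can be
# looked up by its comment ID in O(1).
by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

record = by_id["ytc_Ugg2MtUBRNtZ9ngCoAEC"]
print(record["reasoning"])  # -> virtue
```

The same index works for any batch: parse each stored response, merge the resulting dictionaries, and every comment ID on this page resolves to its coded record.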
Random samples (truncated previews)

- The image of the beast in the Book of Revelation, whom kills anyone who doesn't … (ytc_UgwPArF0K…)
- Nothing new under the sun,,what is happening has happened before,,all but dust … (ytc_Ugx55aVAp…)
- They get the same results while having the AI learn exclusively from its own cre… (ytc_UgxL-GYMD…)
- Selwyn Raithe wrote in 12 Last Steps that “AI’s final limitation won’t be compre… (ytc_UgyRdKWwE…)
- We've been teaching AI for a while now, every click of a button, every interacti… (ytc_UgwatZ_9S…)
- It's sad with a real woman who lives off you and dreams of a better option with … (ytr_UgyK3itbZ…)
- And how do humans generate coherent sentences? Once we've been 'trained' to use … (ytr_Ugw0R9Mf0…)
- ai "art aint art because you arent doing jack shit! you can only call yourself a… (ytc_Ugw38EjEN…)
Comment
> Something worth bringing up is the question of "fun" or "What the robot's want"
>
> The concept of suffering and rights from it is an excellent question, but it is only one half of the question.
>
> For instance say we programmed the robots to have fun mining, to get a shot of excitement every time a cave in almost happens, and to just find joy in sorting one type of rock from another.
>
> Maybe they enjoy it so much that if the robots were left to their own devices, they would, by choice, started to mine whatever they could all by themselves?
>
> Would the robots then demand the right to mine? Perhaps even be willing to take up jobs cleaning the environment so that they can go home and mine for a few hours each day? Could this even be called a 'Right'? Its what the robots both want and demand, but dose the fact that we programmed them this way make it less than the rights we ourselves are programmed to want and demand?
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | industry_self |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugju7aEfGYlMBXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugjeziu4V1EknXgCoAEC", "responsibility": "distributed", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgiLWOcRt89jfHgCoAEC", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Uggqw-SfwBxqHngCoAEC", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UghD-anJqaf-jngCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_Ugg2MtUBRNtZ9ngCoAEC", "responsibility": "developer", "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"},
  {"id": "ytc_UghG18WWY7H_q3gCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggNgQ_Hy9w9vngCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "mixed"},
  {"id": "ytc_UgjaMZKvJE3S4XgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_Ugi411ebTWTvlXgCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
```
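The four coded dimensions suggest a fixed codebook. The sketch below validates a record against the code values observed on this page; this is an assumption for illustration, since the real codebook may define more or different values.

```python
# Allowed values inferred from the samples on this page only;
# the project's actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "distributed", "developer", "ai_itself", "user"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue", "mixed"},
    "policy": {"unclear", "regulate", "none", "liability", "industry_self", "ban"},
    "emotion": {"indifference", "outrage", "resignation", "approval", "fear", "mixed"},
}

def invalid_fields(record: dict) -> list:
    """Return the dimensions whose value falls outside the allowed set."""
    return [dim for dim, allowed in ALLOWED.items()
            if record.get(dim) not in allowed]

# The record coded above for the "AI Moral Status" comment.
sample = {"id": "ytc_Ugg2MtUBRNtZ9ngCoAEC", "responsibility": "developer",
          "reasoning": "virtue", "policy": "industry_self", "emotion": "approval"}
print(invalid_fields(sample))  # -> []
```

A record missing a dimension, or using an unseen code, is flagged by name, which makes malformed model output easy to spot before it is merged into the dataset.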