Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The self-driving car should always stop before hitting someone else. If the car…" (`ytc_Ugg1rrdyz…`)
- "I don't see AI going away anytime soon. I see it becoming a part of our lives mo…" (`ytc_UgxEROAOR…`)
- "Surely emotion is the basis for true rationality? Without emotion they will just…" (`ytc_UgwZ03JxS…`)
- "Imagine your human consciousness stuck in a cyber world forever because u was ri…" (`ytc_UgxoEKImE…`)
- "In the next years if you don´t know about AI, you will be old fashioned and unem…" (`ytc_UgwU1e-D2…`)
- "Im pleased they put "robot" on the video at the start so that I could tell that …" (`ytc_UgyCxhMyE…`)
- "This is precisely why I stopped using Google Photos. Every last one of my photos…" (`ytc_UgwxXSaeu…`)
- "Ask your AI how Enki and Enlil, Gaia and Uranus and the Greece pantheon connect …" (`ytc_UgyTwy-7t…`)
Comment

> I suppose robots (or AI) will need some sort of rights that will give them opportunities to live in our society, but they would probably depend on their programming. And also, we need to think about different outcomes, cause robots can be hacked, or controlled, and they can also start some sort of rebellion, thinking "We are superior to humans, we dont need them!". This topic is very complicated, so it's difficult to discuss it.

youtube · AI Moral Status · 2017-02-23T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UgiLDZDsluuX7ngCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UghO27xPtF4OL3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgiyMwZ_7WU5mHgCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugh-nIhLVlynuHgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugh6GzVlcqfQxHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Uggd7HuqJgAx-XgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgiVAEnmcJsth3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjS4PQpHaKB33gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgjZof-spcqFxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UggrO82HB4K0HHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
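A batch response like the one above can be parsed and sanity-checked before its codes are stored. A minimal sketch in Python — the allowed value sets below are inferred only from the records shown in this export, not from a full codebook, and `parse_batch` is a hypothetical helper, not part of this tool:

```python
import json

# Value sets observed in this export; the real codebook may define more.
OBSERVED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"ban", "liability", "regulate", "none", "unclear"},
    "emotion": {"fear", "mixed", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> list[dict]:
    """Parse one raw LLM batch response and sanity-check each record."""
    records = json.loads(raw)
    for rec in records:
        # Comment IDs in this export all carry the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected id: {rec.get('id')!r}")
        for dim, allowed in OBSERVED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Uggd7HuqJgAx-XgCoAEC","responsibility":"distributed",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(parse_batch(raw)[0]["policy"])
```

Rejecting out-of-vocabulary values at parse time keeps a drifting or hallucinating model from silently introducing new codes into the dataset.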