Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
But thing is robot's cannot feel pain, they can not be sad and they cannot have feeling. We can program robots to make us feel like they have feelings. A robot is just a program, there is no need to give it rights. If it is asking the question do I have a soul, it was programmed that way. If you ever delve slightly into computer programming you will realize why A.I. is a not sentient. A computer also can't want, it can make us think it wants, but that is all it is, makes us think. Going back to something mentioned in the video about pain, when are body is damaged we are punished, when a computer avoids damage, it is because we wanted it to avoid damage.
youtube
AI Moral Status
2017-02-23T14:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytr_UgjpVgvcSYi_hHgCoAEC.8PKKTJFWyhd8PKP7m7X7Uw","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_UggwyeccZd3bXngCoAEC.8PKJqtviChh8PKKYXWx0tZ","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytr_UgjmJdDxrntxfHgCoAEC.8PKJILnWN7U8PKOCYlNmgz","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_UghWX13cdrU353gCoAEC.8PKJ8VyCw3A8PKQNqc_JFs","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UggwPCXEgEoEP3gCoAEC.8PKJ3LAfsg38PKP3vYm1zg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugjt8wMd7spRm3gCoAEC.8PKImtcOhZ-8PKJVRKqHEx","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytr_Ugh2KsJd76wATXgCoAEC.8PKIEnKIJ5W8PKMWltsp9C","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgiRtJgyu3XOaHgCoAEC.8PKET-OdEQT8PKHjC8CeJS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytr_Ugw5QcGlHW-SYgG-k854AaABAg.ASCW4G40CyTASXTW5urkfb","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgzhJIJZaNH5g6umGgR4AaABAg.AQRnd2U_LA-AQRnswyMdYa","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
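The raw response is a JSON array with one record per coded comment, each carrying an `id` plus the four coding dimensions shown in the table above. A minimal sketch of how such a batch could be parsed and sanity-checked before storing the codes — note the allowed value sets below are inferred from the sample records only, and the real codebook may define additional categories:

```python
import json

# Allowed values per dimension, inferred from the sample response above
# (assumption: the actual codebook may include more categories).
ALLOWED = {
    "responsibility": {"distributed", "ai_itself", "none", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"regulate", "ban", "liability", "none", "unclear"},
    "emotion": {"approval", "fear", "indifference", "mixed"},
}

def validate_batch(raw):
    """Parse a raw LLM batch response and reject malformed records."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError("record missing id: %r" % rec)
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError("%s: bad %s=%r" % (rec["id"], dim, rec.get(dim)))
    return records

# Hypothetical one-record batch for illustration.
raw = ('[{"id":"ytr_example","responsibility":"none",'
       '"reasoning":"deontological","policy":"none",'
       '"emotion":"indifference"}]')
print(len(validate_batch(raw)))  # 1
```

A failed check (an unknown category or a missing `id`) raises immediately, which makes it easy to flag a bad batch for re-coding instead of silently writing partial results.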