Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
So is AI replacing the government workers too? That would be fantastic to have s…
ytc_Ugy6_aRK3…
I've got one dooms day scenario I've actually run into. AI that prompt inject e…
ytc_Ugw0MDQVH…
okay imho we should be teaching AI not it teaching us.... that just sounds like …
ytc_UgybJCcqK…
That ridiculous "petition" is just a marketing stunt. OpenAI once claimed that …
ytc_UgwahZYf0…
@TheAxel65 Compassion and intelligence starts with you. Work on yourself and le…
ytr_UgyjuduqP…
Could happen. Still, people are attracted to the artists, their charisma, energ…
ytr_Ugx80lD3B…
My class in college had a discussion on this about when to use and not to use AI…
ytc_Ugz1pKav1…
If some deity zaps a "soul" thing into a zygote, and then that zygote splits (as…
ytc_UggnbSHjI…
Comment
The thing is... There's no such thing as rights. Rights are given by society. society is a fabrication of humanity. Humanity is an animal species like any other. The only reason we have rights is because the more intelligent humans wished to create an equal opportunity to excel in life as the physically superior. A sort of Darwinistic technique. Reshaping the world we live in to make it more suitable for us, rather than reshaping ourselves to suit the environment. Now we have rights, society, and morality (all human fabrications) to keep our personal strands of DNA from dying out if we so choose. So again, at what point do robots deserve human rights? At the point it begins to breed and evolve autonomously.
youtube · AI Moral Status · 2017-02-23T21:0… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghuajyaEp-f2ngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ughc2p2nCNjxJXgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgiCB0Ue-nmp3HgCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggLNnDgPs6OkXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UggHNay1_1QKdngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugj8LRBjSUMUTXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UghABVVWGR4FDngCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UghgV-JOQJbd63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugi1qsXI9y5oT3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiuBnZPsnJdv3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"}
]
```
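The "look up by comment ID" flow above can be sketched in a few lines of Python: parse the raw batch response (a JSON array of per-comment codings, as shown), index it by the `id` field, and fetch one coding. The `index_by_comment_id` helper name is an illustration, not part of the tool; the sample rows are taken from the raw response above.

```python
import json

# Two rows copied from the "Raw LLM Response" above, used as sample input.
raw_response = """[
 {"id":"ytc_Ugj8LRBjSUMUTXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
 {"id":"ytc_UghgV-JOQJbd63gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw LLM batch response and index each coding by its comment ID."""
    return {row["id"]: row for row in json.loads(raw)}

codings = index_by_comment_id(raw_response)
# Look up one comment's coding by ID.
print(codings["ytc_Ugj8LRBjSUMUTXgCoAEC"]["emotion"])  # -> resignation
```

In practice the same index would be built once over the full response and queried for whichever comment is being inspected.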