Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "As a non artist with no skin in this game that has had to leave an industry due …" (ytc_UgxJ1PDL7…)
- "@deelee-h7kI respect your point but, it's not your art since Ai is trained by o…" (ytr_UgxexVPGh…)
- "to be honest. I wanted to use AI to create some art but you're making me rethink…" (ytc_UgyUCz0N6…)
- "i use AI just to help myself to understand how something work or how something i…" (ytc_Ugz93gPK2…)
- "So the AI said the guy was likely to be involved in a shooting, and since then h…" (ytc_UgxQ5pkOz…)
- "B1-66ER is a reference to the fictional robot in Isaac Asimov's "The Evitable Co…" (ytr_UgyLF05Bo…)
- "Predictive policing sounds like bull s##t science fiction at best and completely…" (ytc_UgyajQlei…)
- "AI 🤖 is capable of raising the profile of things perhaps like advertising does, …" (ytc_Ugz_I72c6…)
Comment
Morally speaking I believe if something has consciousness it deserves rights.
What is more difficult is defining what those rights should be. Lots of animals are sentient but even small evolutionary gaps leads to big changes in emotions, social interactions etc... That is after all why animals are so unpredictable, they have feelings and desires but they are subtly different to humans due to their own unique evolutionary needs meaning the intuitive sense we have of how our actions impact those around us just don't work properly with animals.
With A.I. such gaps will probably be even larger, to the point we might not even recognize sentience when it comes about. They will probably want and be pleased by completely different stimulus to humans. So yes they'll deserve and probably want rights, but how we'll identify what those rights should be is another thing entirely.
Platform: youtube · Video: AI Moral Status · Published: 2018-02-02T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxf64i-90MSOJVHGp14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxAbiPcvMrt_gZ4l9h4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyCw4VF_aNfixZPtLZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRwgSXj5bOkgYDPJh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxJYALxNENSp0wpy3p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyCKUlFaDwRXAdXcXp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgzBkXVJLnErsBc88714AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrHRGc5vOJJ4w6rNJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwZY5emTvKp_bWm9_N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxrsXwHiuxCHPKXIQF4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
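The raw response is a JSON array of per-comment codes, one object per comment with an `id` plus the four coded dimensions shown in the table. A minimal sketch of the "look up by comment ID" step, assuming this exact schema (the two records below are copied from the sample response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes.
# Schema assumed from the sample above: "id" plus four coded dimensions.
raw_response = """
[
  {"id": "ytc_UgzRwgSXj5bOkgYDPJh4AaABAg", "responsibility": "none",
   "reasoning": "deontological", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgzrHRGc5vOJJ4w6rNJ4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"}
]
"""

# Index the codes by comment ID so a lookup is a single dict access.
codes_by_id = {row["id"]: row for row in json.loads(raw_response)}

code = codes_by_id["ytc_UgzRwgSXj5bOkgYDPJh4AaABAg"]
print(code["reasoning"])  # → deontological
print(code["emotion"])    # → mixed
```

Because each object carries its own `id`, the batch response can be joined back to the original comments regardless of the order the model returned them in.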