Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up a comment by its ID, or inspect one of the random samples below.

| Comment (truncated) | ID |
|---|---|
| AI critters also forget we artist do this for FUN! Enjoyment! The thrill of fini… | ytc_UgwaLo604… |
| You seriously can’t tell? It’s pretty visible it’s a deepfake and not the real T… | ytr_UgyR15XXy… |
| A potentially rogue/sentient AI with an IQ of 20 could still be extremely dange… | ytc_Ugxsh2-y7… |
| THE HUMAN IN THE CAR IS RESPONSIBLE WHEN A SELF DRIVING CAR GETS INTO AN ACCIDEN… | ytc_Ugyuooj4e… |
| I'm not supporting ai artists, i haven't a strong opinion about this stuff. But.… | ytc_UgzjpotUT… |
| Difference is, the 30,000 that died could have controlled the outcome. If your s… | ytc_Ugz1YNjhb… |
| US Lawmakers? They can't repeal daylight savings or fund the government on time y… | ytr_Ugy-Fnv90… |
| @Simboiss it analyzes the key words you use (for example, when you use… | ytr_UgyICo5Fw… |
Comment
It makes we wonder; should the question be should robots be sentient? Do they need to be? Do they need, or better, do we need (as the robots, after all, are being created for our benefit in some way), for them to feel pain? With AI being touted as an existential threat, it seems odd that we care about giving them rights because of capabilities we willingly gave them in the first place, something which, for the sake of our own survival, may not have been wise to do so.
| Source | Video | Posted |
|---|---|---|
| youtube | AI Moral Status | 2018-12-17T14:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwxbGDsqCiSNuQhRfR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwVgsUMRlcixxZfw8p4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx631C12qaWO8ZV5vN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzT9okUcSlUw_n7VSd4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyipJWvs2wfjFQCHOd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz5JhTK_u6mxG2AHjB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwOauXWT3WGLwPcCXV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYcPwVex6HZFsb2kl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUSa2V7YjIYKavGqN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxbAV0LZJaLxgRItnl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
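The raw response is a JSON array with one object per comment in the batch, so looking up a single comment's coding is just a matter of indexing the array by `id`. A minimal sketch in Python (the field names match the response above; the two-row `raw_response` string is a shortened excerpt, and the variable names are illustrative):

```python
import json

# Raw LLM response: a JSON array of per-comment codings (excerpt).
raw_response = """[
  {"id": "ytc_UgwxbGDsqCiSNuQhRfR4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzT9okUcSlUw_n7VSd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]"""

# Index the batch by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Fetch the coding for one comment and read a dimension.
coding = codings["ytc_UgwxbGDsqCiSNuQhRfR4AaABAg"]
print(coding["reasoning"])  # consequentialist
```

Each value in `codings` holds all four dimensions shown in the Coding Result table (responsibility, reasoning, policy, emotion), so rendering that table from a lookup is a straightforward iteration over the dict.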