Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:
- "The funny thing about ai image generation models is that they fall apart after y…" (`ytc_UgyspqrKF…`)
- "God if there's anything i want AI to do its doing my chores :/ ( only if it doe…" (`ytc_UgzFKZz4G…`)
- "The scariest part about AI is the ability to trick people into believing shit li…" (`ytc_Ugz-E2CUs…`)
- "That was the point of the ad, the implication that humans make spelling errors w…" (`ytr_UgxUNstKW…`)
- "Note how the officer walked around the robot. This is a good photo opp for peopl…" (`ytc_UgwIVtBxG…`)
- "This guy was at Rhode Island Comic Con this past weekend and I tried to tell the…" (`ytc_UgwhQuz1H…`)
- "@nimrodery photoshop and fan art has existed for years, its virtually impossibl…" (`ytr_Ugxwmcvln…`)
- "If you were using AI, wouldn't you still have to give the machine an idea of wha…" (`ytr_UgyK5l_I_…`)
Comment (youtube · AI Moral Status · 2020-07-20T18:2…)

> It's not about whether AI (or any inanimate object for that matter - think of sex dolls) 'deserves' rights. It's more about whether even normally decent people will start acting like slave owner a-holes or not. AI is a scarce resource and should be treated as such - don't waste it. We know that the internet didn't bring out the best in humans. So, culture and education will need to adapt and it won't be easy. I don't think legislation will do miracles, but it's a start - society setting norms for acceptable behavior. How it's enforced is a different matter. The example of animal mistreatment and its correlation with crime should give an idea - if not enforced, some people are tempted to let their inner monster loose.
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[{"id":"ytc_Ugw_TfPRrOOOhJdA3xh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgznS0BoSSoOvI2YrRZ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwAN9YppoZmQ26k_AF4AaABAg","responsibility":"none","reasoning":"contractualist","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzDstFwtt3R46fASG54AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMBoyOuZxuQJeHPyB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxL67iRnZmsn98cJg14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx7q_oEmI0YQ0VA2gh4AaABAg","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwvrPQUl49mGlkzNBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyZ1Q458X4GBQOHQYN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgwcHhN1oBEmIXh4vz94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
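The Coding Result table is derived from this raw JSON array: each element carries the four dimensions for one comment. A minimal Python sketch of the look-up-by-comment-ID step, assuming the model's response is a JSON array like the one above (the helper name `lookup_coding` is illustrative, not part of the tool):

```python
import json

# Excerpt of a raw LLM response: a JSON array of per-comment codings.
raw_response = '''[
{"id":"ytc_UgwvrPQUl49mGlkzNBJ4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwcHhN1oBEmIXh4vz94AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"}
]'''

def lookup_coding(raw, comment_id):
    """Parse the model's JSON array and return the coding dict for one comment ID,
    or None if the ID is absent from the response."""
    codings = json.loads(raw)
    return next((c for c in codings if c["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgwvrPQUl49mGlkzNBJ4AaABAg")
print(coding["emotion"])  # fear
```

This matches the row shown in the Coding Result table above (responsibility: user, reasoning: consequentialist, policy: liability, emotion: fear).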