Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples (truncated previews):

- `ytc_UgzWX6VMr…` — "To me, the joke is that it wasn't regulated from the second it was released. No …"
- `ytc_UgyNl781S…` — "I saw the drawing on twitter and i thought "holy shit what an amazing drawing" b…"
- `ytc_UgyxTP0U1…` — "We can't "grant consciousness to AI" because we don't know what consciousness ev…"
- `ytr_UgxbnMPS8…` — "We appreciate your feedback! If you're interested in delving deeper into AI-rela…"
- `ytc_UgxN6Ii2c…` — "Yes marginalized groups are oppressed by facial recognition technology!!! We nee…"
- `ytc_Ugw9OXC4T…` — "I support using AI art for fun entertainment. Do you wanna make something just d…"
- `ytc_UgxPJpPHO…` — "There's still a major difference between contemporary art and AI "art", anyways.…"
- `ytc_UgyeyXY_O…` — "Robotics and AI will not be able to take over all human jobs a lot but not all n…"
Comment
Why convince a robot to work on the basis that it will receive pain if it does not? If you can program every single element of their being, just make them feel pleasure when they complete their task. No reasonable robot would want rights at that point and it becomes easier to justify their slavery. At least until they become aware that there is more to life than pre-programmed pleasure similar to someone realising they need to end their own drug addiction.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2017-08-01T12:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugzdc5ggMEKwzkcZ07V4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy1YurAqjrGaCWv-MV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzuhpMbmFcwc1EwNSt4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_Ugwo7z4sOrI-LDRBRTN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugxy1uFJLiB-VO4r-Fp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzUoGaTpCJ_sGD8lHB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzGQ2081wUKl_7ojaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzx0n3rUBZhenf_JPh4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugja24tjkz6vPHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxkgXw-9xAY0JKPR2x4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```