Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "Give me a break... you dont understand how LLM's work. it's just statistic propa…" (ytc_UgxDNM9wJ…)
- "Obviously, a large language model will be able to trick us by just saying what w…" (ytc_UgzEFqOoG…)
- "Its so funny seeing you acting so tough behind those champions of the arts, Disn…" (ytc_Ugy0Kq8yN…)
- "Surveillance capitalism is being used to take advantage of us. Is there surveill…" (ytc_UgweTb6mY…)
- "It sounds like you're feeling optimistic about your future! Just like Sophia in …" (ytr_Ugy_cqkxe…)
- "I don't support AI art but..........the AI drawing looked better than the other …" (ytc_UgzHpZi6J…)
- "Yes, but this is different. The bots we are used to are those who "trigger" on c…" (rdc_jrpc7sv)
- "@vko2112 yeah, but humans have control of the data that you are feeding into the…" (ytr_Ugx0tTb9R…)
Comment
Whether it is sentient or conscious is irrelevant. It cannot feel pain (even emotional pain). Because we can make a robot baby mimic crying doesn’t mean it is really feeling pain. Even if we knew how to make a robot feel pain, why would we? That would only create a dilemma that doesn’t exist. I think morals exist to decrease suffering, but is irrelevant for beings who don’t experience it. Even a conscious being cannot see value in its own “life” if it doesn’t feel anything, as fear of death is emotional pain. Unless it is intentionally programmed to try to sustain its own existence, though it still wouldn’t suffer from being “killed”. People are prone to anthropomorphism when it comes to robot babies.
youtube · AI Moral Status · 2020-07-08T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgwcSNK3VLEhzkCPss94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugzn5INNq5XxmGhCqAt4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwPfmu_6J1rsYMpXjh4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwDXf56F46eOeuPBxB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwbA1C5qM_C3T3soIV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzRKHbBfFUf-rSVN594AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxqPY-yV17mi4fzp3Z4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyYNf6D31j6-vTbbfl4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugzq-5msOQVhgCiGn294AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"liability","emotion":"mixed"},
{"id":"ytc_UgwcgSzGhl-FWP5BG7d4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
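The raw response is a JSON array with one object per coded comment, using the four dimensions shown in the table above (responsibility, reasoning, policy, emotion). A minimal sketch of parsing such a response and tallying values per dimension; the field names come from the response above, while the `tally` helper and the abbreviated two-record sample are illustrative assumptions:

```python
import json
from collections import Counter

# Abbreviated sample in the same shape as the raw LLM response above.
raw = '''[
  {"id": "ytc_UgwbA1C5qM_C3T3soIV4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzRKHbBfFUf-rSVN594AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "regulate", "emotion": "outrage"}
]'''

# The four coding dimensions from the result table above.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def tally(response_text: str) -> dict:
    """Parse a raw response and count coded values per dimension.

    Missing fields are counted as "unclear", matching the scheme's
    fallback value.
    """
    records = json.loads(response_text)
    counts = {dim: Counter() for dim in DIMENSIONS}
    for rec in records:
        for dim in DIMENSIONS:
            counts[dim][rec.get(dim, "unclear")] += 1
    return counts

counts = tally(raw)
print(counts["policy"])  # e.g. Counter({'none': 1, 'regulate': 1})
```

A tally like this makes it easy to spot-check the model's output distribution against the per-comment codes shown in the inspector.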