Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzFxH-vi…: "Here we go again just like IROBOT THE MOVIE, beside I donot want robot in my hou…"
- ytc_UgyVFwG2f…: "What do you think about jobs that require human interaction and maybe more compa…"
- ytr_UgyiO7a7w…: "If AI was used for dishonest purposes, then whomever instructed that AI to give …"
- rdc_ohzff3a: ">I don't expect it's without Silicon Valley's overlords approving. I don't t…"
- ytc_Ugx87iWI_…: "AI won't matter . There will be no one who can afford it other than the uber wea…"
- ytc_Ugxgw70iv…: "With all the responses downplaying concerns in the comments section, I can't hel…"
- ytr_Ugy-LA-03…: "AI doesn't need to be sentient or think it is sentient, the madness of creating …"
- ytc_UgzezDNYr…: "How fast can it go? Can the police just run over a criminal while chasing them w…"
Comment
Dear Sabine, I have a PhD in Information Systems from a top ranked university.
I think current LLM systems are probably conscious or very close to be.
Important enough, thier conscience (if it exists) is not similar to ours. BTW some animals also are self-conscious but their conscious differs from one animal to another.
youtube · AI Moral Status · 2025-07-10T10:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_Ugys0vWGbvO4ZCmkhUV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxQFLZmPjmEWsASsml4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxte9bCrgURHnkiBfx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzwW2KwHuMwLVxy9Ah4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgyW74Jj06n6NcaIUil4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzYqRGLJiw8e_JyVdJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgwWu4FzvmPgIk-s10B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugz4yt81iE1cJiR8_0B4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwuZz8AjZgbynwKM0l4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwPHYFPHq2blkKNn1N4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"fear"}
]
```
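The batch response above is a JSON array of per-comment codes, keyed by comment ID, which is what makes look-up by ID possible. A minimal sketch of parsing and indexing such a batch is below; the dimension names come from the response itself, but the sets of allowed values are only the values observed in this sample (the full coding scheme may permit more), and `index_batch` is a hypothetical helper, not part of the actual pipeline.

```python
import json

# Values observed in the sample responses above; assumed, not the full scheme.
OBSERVED_VALUES = {
    "responsibility": {"developer", "company", "user", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "liability", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def index_batch(raw: str) -> dict:
    """Parse one raw LLM batch response and index the coded rows by comment id."""
    rows = json.loads(raw)
    indexed = {}
    for row in rows:
        # Reject values outside the observed coding vocabulary.
        for dim, allowed in OBSERVED_VALUES.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row['id']}: unexpected {dim}={row.get(dim)!r}")
        indexed[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return indexed

raw = ('[{"id":"ytc_Ugys0vWGbvO4ZCmkhUV4AaABAg","responsibility":"developer",'
       '"reasoning":"virtue","policy":"unclear","emotion":"mixed"}]')
coded = index_batch(raw)
print(coded["ytc_Ugys0vWGbvO4ZCmkhUV4AaABAg"]["reasoning"])  # virtue
```

Indexing by ID also makes it easy to join the codes back onto the original comment metadata (platform, video, timestamp) shown in the detail view.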