Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Please note this would be completely voluntary on the part of OpenAI due to the …" (`rdc_nc32b0d`)
- "Like all man's inventions there are positive and negative consequences. The adv…" (`ytc_UgwcvHhjk…`)
- "46:50 I think for overcoming this type gap between AI and human, we need to give…" (`ytc_UgwQFStAG…`)
- "If anyone’s looking for a free option, Clever AI Humanizer does the job perfectl…" (`ytc_UgwwvcF6n…`)
- "Several issues with your arguments. Im not rage baiting, but pointing this out f…" (`ytc_UgzV5LCTt…`)
- "I always go for scientist style prompts because it appears to produce most accur…" (`ytc_Ugx682vXO…`)
- "I genuinely believe it's too late to hold a conversation at all. People are alre…" (`ytr_UgxXJL5g5…`)
- "Don't worry! The conversation in the video might seem a bit intense, but it's al…" (`ytr_Ugw6XP3Eb…`)
Comment

> I can't say I agree with the idea of robot rights. There code, not brain or otherwise. Thus they are philosophical zombies, able to react and act but not able to really feel as us humans do.

youtube · AI Moral Status · 2017-03-06T04:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghKkbKM7RfTSngCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjI9lR0B-QpLngCoAEC","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UghPZpawqsXIxngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugg3YIAoHWeF73gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UghZs-vx_DY4WngCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugj4TBYHcuy8QHgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_Uggj6wVem7oUqXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UghZ2Kej7Awjx3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UghHQZM9DEXzg3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiGAv21OsCOaHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
```
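A batch response in this shape can be checked before the codes are written back to the dataset. The sketch below is a minimal validator, assuming each row carries an `id` plus the four dimensions shown above; the allowed values are inferred only from the labels visible in this section, not from the project's full codebook, so treat them as placeholders.

```python
import json

# Allowed values per dimension, inferred from the labels visible on this page.
# The project's actual codebook may define additional categories (assumption).
ALLOWED = {
    "responsibility": {"none", "developer", "ai_itself"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "industry_self", "unclear"},
    "emotion": {"approval", "indifference", "fear", "outrage", "resignation"},
}

def validate_batch(raw: str) -> list[str]:
    """Parse one raw LLM response and return a list of validation errors."""
    errors = []
    for i, row in enumerate(json.loads(raw)):
        if "id" not in row:
            errors.append(f"row {i}: missing id")
            continue
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                errors.append(f"{row['id']}: bad {dim}={value!r}")
    return errors

# Hypothetical one-row batch for illustration.
raw = ('[{"id":"ytc_x","responsibility":"developer",'
      '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
print(validate_batch(raw))  # → []
```

Rows that fail validation can then be re-queued for the model rather than silently coded with an out-of-vocabulary label.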