Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a specific comment by its ID.
Random samples:

- "Honestly. If your writing can be replaced by Ai then the death of creativity isn…" (ytc_UgyNcn4pP…)
- "You shouldn't be considered an artist if you just ask an AI to generate a certai…" (ytc_UgxW0MDq8…)
- "A good place to start regulating harmful AI's is with the all-powerful Big Tech …" (ytc_Ugy_5cQLf…)
- "While this video highlights important real-world challenges like mass layoffs, i…" (ytc_UgxgmWSaM…)
- "They need to be exposed on shows like American Greed, they figure it’s cheaper t…" (ytc_Ugx3G24Qp…)
- "When OpenAI feared legal action from the Hyperonics Hypersnap screenshot they to…" (ytc_UgzwMdCLc…)
- "you don't get it alot of effort goes into thinking of different words for breast…" (ytc_Ugy8ZIz2g…)
- "“Pentagon Defends Child-Abusers While Attacking Some Vague Attempt At Morality I…" (rdc_o78ugi6)
Comment
There's no need to create AI robots which can feel pain and sadness. The earth is already suffering the burden of large population. We don't need to create another species who will enjoy their life on their own without making our lives easier. A robot should be intelligent enough to help humans but I don't see any reason to give them human emotions.
Would you feel sad if someone unplugs your toaster? No, and that's why we should not build humanoid AI robots
Source: youtube · Video: AI Moral Status · Posted: 2017-02-23T18:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | resignation |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugh8Be6KyQwV-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_Ugig6ZaSL0xYUngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_Ugg0JlDKIdxowHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgjucK8bclx98HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
{"id":"ytc_UgjxYcYgD_mbE3gCoAEC","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgjxlQ5IIqou-HgCoAEC","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UghWat3HN-CRn3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
{"id":"ytc_Ugg6_7WvjPQi53gCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UghrPpy0tE2CDXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
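The raw response above is a JSON array, one record per comment, keyed by `id` and carrying the four coding dimensions shown in the result table. A minimal sketch of how such a batch response could be parsed and indexed for the "look up by comment ID" view (the `index_raw_response` helper and its malformed-record handling are assumptions, not the tool's actual implementation):

```python
import json

# The four dimensions seen in the coding result table and raw response.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def index_raw_response(raw: str) -> dict:
    """Parse a raw LLM batch response (a JSON array of coded records)
    and index it by comment ID, skipping malformed records."""
    coded = {}
    for rec in json.loads(raw):
        if isinstance(rec, dict) and "id" in rec and all(d in rec for d in DIMENSIONS):
            coded[rec["id"]] = {d: rec[d] for d in DIMENSIONS}
    return coded

# Two records taken from the raw response above.
raw = """[
 {"id":"ytc_UggpmXzTxCn0_HgCoAEC","responsibility":"developer",
  "reasoning":"consequentialist","policy":"ban","emotion":"resignation"},
 {"id":"ytc_Ugig6ZaSL0xYUngCoAEC","responsibility":"none",
  "reasoning":"unclear","policy":"none","emotion":"approval"}
]"""

coded = index_raw_response(raw)
print(coded["ytc_UggpmXzTxCn0_HgCoAEC"]["policy"])  # → ban
```

Skipping records that lack an `id` or a dimension keeps one malformed line in a batch from discarding the whole response.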