Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Except AI can do those jobs too. Humans are only necessary if you need human-fla…" (`ytr_Ugy99dODp…`)
- "I did dabble a bit in ai image generation a while back, just out of curiousity. …" (`ytc_Ugx6G1ZaM…`)
- "I do not read AI robot to tell me the story of these Palestinian losers/terroris…" (`ytc_UgwDei0yV…`)
- "of course we still need to learn it if we are doing any tech job. but not the wa…" (`ytc_UgxVXIfXw…`)
- "It’s ok but having a AI is a bad choice to have or build. Creators should be on …" (`ytc_UgwfQuRZx…`)
- "AI still not a human. It became dangerous if the people not responsible and not …" (`ytc_Ugwsy7OWz…`)
- "It hasn't been valuable for YEARS! Ai is just making the problem worse. But nobo…" (`ytc_Ugx_HvRv4…`)
- "It may be generating false memories, yet still retaining the neuronal process of…" (`ytc_UgwyNJ9Q_…`)
Comment

> I'd argue that programming a robot to feel pain just to be more like us would be morally wrong. Better yet, if we could, in theory, give a robot programming that let it feel endless pleasure and completely ignore the fact that it's only real purpose was to serve humans, would they ever mind?

youtube · AI Moral Status · 2017-02-23T22:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_UghhOyFAuDaTTHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggZWyJDfZsx4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugg36Q7TFmanhngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgjixkGRfglkyngCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgiN2NjIkn9_MXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugg-qYN95-Pf5HgCoAEC","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"indifference"},
  {"id":"ytc_UgjFwmJ0dxkyO3gCoAEC","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UggSKKvGLzax03gCoAEC","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"},
  {"id":"ytc_Ughjrlezex4FengCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UggiBEYoD2pkIngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"mixed"}
]
```
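A response in this shape can be parsed and sanity-checked with a short script. The sketch below is a minimal validator, not part of the tool itself; the allowed values per dimension are inferred from the samples shown above and may not cover every code the actual schema permits.

```python
import json

# Allowed values per coding dimension, inferred from the samples above
# (an assumption -- the real codebook may define additional values).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "distributed", "unclear"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "mixed", "unclear"},
    "policy": {"none", "liability", "unclear"},
    "emotion": {"indifference", "approval", "mixed", "outrage"},
}

def validate_coding(raw: str) -> list[dict]:
    """Parse a raw LLM response and return any records with unrecognized values."""
    records = json.loads(raw)
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append({"id": rec.get("id"), "dimension": dim, "value": rec.get(dim)})
    return problems

# Example with a hypothetical record ID:
raw = '[{"id":"ytc_x","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"}]'
print(validate_coding(raw))  # [] -- every dimension holds a recognized value
```

Looking up a single comment ID is then a matter of indexing the parsed records by their `id` field.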