Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
2:10 Ability to what? Suffer? You deserve rights if you suffer? I thought that to be conscious being must have an ability to make own decisions without a human (so called free will). We can't measure that yet, but we'll see if AI manages to reach some human level of legal capacity. So long, giving any right to a program will give some criminal right to say: "i didn't do that, it was a program I wrote, it has rights". Only than goes the right to protection against suffer.
| Source | Video | Published |
|---|---|---|
| youtube | AI Moral Status | 2017-02-23T15:0… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | deontological |
| Policy | liability |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugg_QYru6wa1tXgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UggT6GO8pu-LdHgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgjfgNz3S53w43gCoAEC", "responsibility": "developer", "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgjXr8LkP1gp5HgCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UghAmBOHrhW17XgCoAEC", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgiUu5x4djUPgngCoAEC", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugg3KxPrjlzt6HgCoAEC", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UghhrPdWpuzKU3gCoAEC", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "mixed"},
  {"id": "ytc_Ugh0heeLSy6b-3gCoAEC", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgioLnzTccRAJHgCoAEC", "responsibility": "developer", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"}
]
```
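The inspection step above — finding the coding for one comment inside a raw batch response — can be sketched in Python. This is a minimal sketch, not the tool's actual implementation; it assumes the raw response is a well-formed JSON array of objects keyed by `id`, as in the example shown here, and the `lookup_coding` helper name is hypothetical.

```python
import json

# A raw LLM response: a JSON array, one object per coded comment
# (a one-row excerpt of the batch response shown above).
raw_response = """
[
  {"id": "ytc_UghhrPdWpuzKU3gCoAEC", "responsibility": "user",
   "reasoning": "deontological", "policy": "liability", "emotion": "mixed"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding row for one comment ID,
    or None if the ID is absent from the batch."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UghhrPdWpuzKU3gCoAEC")
print(coding["policy"])  # liability
```

In practice the raw string may not parse at all (the model can emit malformed JSON), so a real lookup would wrap `json.loads` in error handling before indexing into the result.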