Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "it's not really an AI lawsuit though is it. They've got them on the piracy again…" (ytc_Ugw39Zvls…)
- "Teachers do more and more on AI. Its robots teaching robots. Who even needs t…" (ytc_Ugyx7XqFv…)
- "Despite recent advancements, AI and robotics remain unable to perform manual tas…" (ytc_UgxzWYvJc…)
- "Learns by stealing from other people and jumbling it together is how humans lear…" (ytc_Ugz3eqV1R…)
- "Tbf, when an actual sentient ai makes stuff, would it be considered a xenos art?…" (ytr_UgzoKgr1h…)
- "I don't think so, just like their creator = human, the AI will likely be depress…" (ytc_Ugyx2Onw7…)
- "Companies are taking action on potential productivity, not actual productivity g…" (ytc_Ugx0iRrYd…)
- "Ai can't replace artist it just can't as crude as this may sound artist are AI …" (ytc_UgxY4sCOA…)
Comment
Think about it this way. If you create a flowchart with billions of steps and run it, does the flowchart deserve rights? No. Any "pain" experienced by an AI is only simulated because the pain does not correlate with a life threatening event like it does in humans and animals. However, pain isn't a perfect basis for rights. For example, in the Somatoform Disorders, human nervous systems are capable of 'misfires,' creating the experience of pain the brain/consciousness where no painful stimuli exists. Do these unfortunate people deserve elevated rights based on their increased capacity for pain? No.
Source: youtube · Video: AI Moral Status · 2017-04-02T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UghN0A8SEeh4RXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"},
{"id":"ytc_UghzlaQnMcZyqXgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugh6kh87bJEztngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgiK41GCIEutVHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UggpE9hB8ZGKUXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugjs7Uuups4vv3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"indifference"},
{"id":"ytc_UghF3cqakiS6zngCoAEC","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UggTOrD8M8fPnXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjHaZq_lbQMNHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgjHONlI3SmohHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```
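The raw response is a JSON array of per-comment records, one object per comment with an `id` plus the four coding dimensions. A minimal sketch of how such a response could be parsed, validated, and looked up by comment ID; the allowed values per dimension are inferred from this sample and the table above, not from the project's actual codebook, and the record ID used below is hypothetical:

```python
import json

# Allowed values per dimension, inferred from the sample output above.
# The real codebook may define a different or larger label set.
SCHEMA = {
    "responsibility": {"developer", "company", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none"},
    "emotion": {"indifference", "outrage", "fear", "approval"},
}

def parse_raw_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        # Reject any record whose label falls outside the expected set.
        for dim, allowed in SCHEMA.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
        coded[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return coded

# Hypothetical single-record response, shaped like the output above.
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"indifference"}]')
coded = parse_raw_response(raw)
print(coded["ytc_example"]["emotion"])  # indifference
```

Indexing by ID is what makes the "Look up by comment ID" view cheap: once parsed, each coded record is a constant-time dictionary lookup.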