Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment by its ID.
Random samples

- "Imagine a robot that feels like you. Thinks like you. Why shouldn't he have righ…" (ytc_UgiKYV8v9…)
- "AI will be put in place to do many many jobs that humans used to do and that way…" (ytc_UgwtgR4Wf…)
- "For some it looks like a really cool thing but I'm pretty scared what destructio…" (ytc_UgyNKv5Y1…)
- "A parallel would be shutting down all your rubbish processing facilities. All t…" (rdc_gx80pq2)
- "@Primiumy Ai itself is free. Always has been. The money goes to the artists who …" (ytr_UgxdIQ7vh…)
- "No, the Superintelligence team is for aligning ASI not AGI. OpenAI has made it p…" (rdc_l5mktuh)
- "Who would have thought integrating Ai into their back end would screw them in th…" (ytc_Ugxz46zfv…)
- "Anthropomorphizing the whole thing is falling for the whole Altman Grift. Whethe…" (ytc_UgwVS0BLN…)
Comment
Humans are technically robots, but instead of metal and rubber we have muscles and cells. Instead of a motherboard, we have a brain.
So if we're technically robots in a biological sense, then what gives us more rights than an actual robot?
If any being has consciousness then it deserves rights. We have seen what happens when people/nations do not give others rights.
Source: youtube · Video: AI Moral Status · Posted: 2017-02-25T14:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_UghKoK55MKjPi3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgjLk4dwj6E7c3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgifpkGSnco6Q3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugh3oGA9UWKfbXgCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ughlf5QYg265ZngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugjui8lyYzSrvHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgiuXt-nUv5jbngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UggvBcByL6n803gCoAEC","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"approval"},
{"id":"ytc_UgjWed3DMfpEnHgCoAEC","responsibility":"none","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UghAqkiQfyzw4HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"}
]
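A raw response like the one above can be parsed and indexed by comment ID for the lookup described at the top of this page. Below is a minimal sketch: the four field names come from the coded output shown here, but the allowed value sets are assumptions inferred from this one sample, not an exhaustive codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Allowed values per coding dimension. These sets are assumptions based on
# the values observed in the sample response above, not a full codebook.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "developer"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"approval", "fear", "indifference", "resignation",
                "mixed", "outrage"},
}


def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index records by comment ID.

    Raises ValueError if a record is missing a field or uses a value
    outside the allowed set, so malformed model output fails loudly
    instead of silently entering the coded dataset.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim}={rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Once parsed, a lookup by comment ID is a plain dictionary access, e.g. `coded["ytc_UghKoK55MKjPi3gCoAEC"]["emotion"]`.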