Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgyLpkVsN…`: "I don't understand what the problem was... That the guy calls himself an artist …"
- `ytr_Ugx_CtuMf…`: "@LearningExperienceDesigner yep AI companies will be the main distributors. Which…"
- `ytr_UgxLb2wJN…`: "We appreciate your perspective. Remember, on the AITube channel for subscribers,…"
- `ytc_UgwqkHsrc…`: "Thank you for sharing this information with us. It is an amazing account of wha…"
- `ytc_UgzswfLP3…`: "There is a site that 'poisons' images, so that AI breaks when it tries to use it…"
- `ytc_UghjkE0Lp…`: "I will fight for robot rights!!!! WHOS WITH ME (as long as they can be as intele…"
- `ytr_Ugx17AFMa…`: "@NightsDwarf Authorship goes to humans who own this AI systems or users authori…"
- `ytc_UgzkMW5BM…`: "Crypto is the only answer / Money is printed and at anytime the government can gi…"
Comment
> Until I can back up my brain and implant it into a functioning unit I do not think that AI has any rights. Humans are individuals whose knowledge and experiences die with the individual. Over the course of their life they may be able to shape the knowledge and experience of other humans but, ultimately, that human's knowledge and experiences are lost forever.
>
> The flipside is an AI can be backed up and redeployed into a functioning does not have to have concern that the progress it has gained will ever become lost or irrelevant.

youtube · AI Moral Status · 2017-03-04T08:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgiA1_INbJOFTXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugi1nrPKExbHOHgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UggFGTUIov_oOHgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UghdOolC8joZ6ngCoAEC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_UghAw59QZBitCngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgjT-wD9PuFMo3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UggRBlCDj7mB73gCoAEC","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgiLuFIX4HCn7HgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgjcZKTKJEoieXgCoAEC","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UggVd289Q9KLTngCoAEC","responsibility":"unclear","reasoning":"deontological","policy":"none","emotion":"indifference"}]
```
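Matching a comment's coded dimensions back to the raw model output can be sketched roughly as below. This is a minimal illustration, not the tool's actual implementation: the `lookup` function and the all-`unclear` fallback are assumptions, and the two embedded records are shortened copies of the IDs shown above. The hypothesis it illustrates is that a comment whose ID is absent from (or unparseable in) the raw response ends up coded `unclear` on every dimension, as in the Coding Result table.

```python
import json

# Two records copied from the raw response above (illustrative subset).
raw_response = """[
  {"id": "ytc_UgiA1_INbJOFTXgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_Ugi1nrPKExbHOHgCoAEC", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

# The four coding dimensions shown in the Coding Result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def lookup(raw: str, comment_id: str) -> dict:
    """Return the coded dimensions for comment_id, falling back to
    'unclear' for every dimension when the ID is missing from the
    model output or the output is not valid JSON."""
    try:
        records = json.loads(raw)
    except json.JSONDecodeError:
        records = []  # malformed model output: treat as empty
    by_id = {r.get("id"): r for r in records if isinstance(r, dict)}
    row = by_id.get(comment_id, {})
    return {dim: row.get(dim, "unclear") for dim in DIMENSIONS}

print(lookup(raw_response, "ytc_UgiA1_INbJOFTXgCoAEC")["policy"])  # regulate
print(lookup(raw_response, "ytc_not_in_output")["emotion"])        # unclear
```

If the model emits a truncated array or a stray closing character, `json.loads` raises and the fallback yields an all-`unclear` row rather than crashing the viewer.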