Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
give robots rights after we leave earth and tell no one/put nothing in the media or internet about it so it would take the robots a long time to find us.when you do that, give robots feelings and stuff, but don't teach them violence and make them incapable of learning it. its the only way to give robots rights and feelings without it becoming Terminator or the Sentinels from The Matrix. the more we worry about giving something rights that doesn't even exist yet, lets ponder the questions: should feeling or self aware AI exist without it trying to kill everything? or would a Skynet-like virus infect the self aware robots and make them genocidal? should their minds be the internet? what will they do?
youtube · AI Moral Status · 2017-02-25T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugh_lhwycoYkl3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UggdcRoBuxL9z3gCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Uggf3fSG7XhI2ngCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UggNUjIVjFsz73gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UggzsyIZT_K1A3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugh32Vghx0DeXHgCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_Ugh7NSvs2yysSXgCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"indifference"},
{"id":"ytc_Ugiv_A9GlQMe1XgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_Ugh4RafM8_B_dXgCoAEC","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugjj2FpuF6iXDngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
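The raw response above is a JSON array of per-comment codings. A minimal sketch of how such a batch could be indexed for per-comment lookup, assuming only the array-of-objects shape shown above (the helper name `lookup` is hypothetical; the two entries are copied from the response):

```python
import json

# Two rows copied verbatim from the raw LLM response above.
raw = '''[
  {"id": "ytc_UggzsyIZT_K1A3gCoAEC", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugh7NSvs2yysSXgCoAEC", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "indifference"}
]'''

# Index the batch once by comment ID for O(1) lookups.
codings = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id: str) -> dict:
    """Return the coded dimensions for one comment, minus the ID itself."""
    row = codings[comment_id]
    return {k: v for k, v in row.items() if k != "id"}

print(lookup("ytc_UggzsyIZT_K1A3gCoAEC"))
# → {'responsibility': 'none', 'reasoning': 'consequentialist', 'policy': 'regulate', 'emotion': 'fear'}
```

The dimensions returned for `ytc_UggzsyIZT_K1A3gCoAEC` match the Coding Result table above, which is how a coded record can be traced back to the exact model output that produced it.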