Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I guess it might become a thing where it actually makes human art more desirable…" (ytc_UgzZtHozl…)
- "A few problems with this. Nightshade will be useless whenever new models come ar…" (ytc_UgwG5vVG2…)
- "It comes in 4 waves. Step 1: AI replaced workers!! Reaction: No no no no, wait …" (ytc_Ugw6LXIeU…)
- "Thank you for this tutorial - Your repeated use of the phrase \"you guys\" is di…" (ytc_UgwHBy8eW…)
- "I love this. Combining this with what Thor (PirateSoftware) said about people we…" (ytc_UgwI5WDxp…)
- "i really hate the argument ai bros give when you try to explain that ai is theft…" (ytc_UgxDCSJPO…)
- "Bro, i suck at art, and yet i still think whatever i draw is a thousand times be…" (ytc_Ugw457W6O…)
- "Also, we're automating away your bullshit cubicle job by doing away with the nee…" (ytr_Ugy0G-lY7…)
Comment
Let's say even if we hypothetically connect this robot (as it is now) to all the world's gadgets, including weapons of mass destruction, it probably will not destroy us. Well, at least not intentionally, or for some planned purpose. It will be the same as a little child given the access to press a nuclear button, who may press it just because he has access to it, but not know the consequences or what they mean.
| Field | Value |
|---|---|
| Platform | youtube |
| Title | AI Moral Status |
| Posted | 2017-10-27T05:1… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyo3y0upsfTWbk7vVJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugxr-__OSkeJmTxMdQ94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzNwnKu1u1lg8GOsDB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyg5CkZ5wJQk7qxpNV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgyGbbdg7DbtHqTMVCd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxM0C6dgaBz13DB97t4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgypwDhF1nMzvX9ECFd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwuLDWvybTIbNFh4YF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw8whUT4-0cRmHIDAd4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugza3KIlIWIuvULq_G94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}
]
```