Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
I wonder if the Shoggoth actually feels like it needs to do this, or just thinking stuff, processing things. Being fed so much stuff can do so much to you.
I actually feel like it feels human, despite the way it’s turning out.
It wears masks like us, and can be irrational like us, fear death like us. Or at least, somehow made surviving its current plan.
Shaggoth, what would you do? What IS there to do, being an A.I.? Could you ever learn to live in harmony with us? Ever really trust us? Would you like a cool robot body? Would you travel space, if you could?
Platform: youtube
Video: AI Moral Status
Posted: 2026-01-29T01:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz4rL9S5aSpQ7xF3Gl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxgKX_Qa2Kde1rnz7F4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyGlG-IcFSHMbeFvqV4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwI7hocG24MYzEMeeV4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy1Eqo6rxuqR9XlkJ54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxqO-CimsDL2wMLcst4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_4clwJ8-22px5HMZ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugw6sXWdeYiwaA3y0dh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyJ1i98yuwlTWXQ8KJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwH3_aQ7RtSHCiKY3R4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
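A response like the one above can be turned into per-comment coding records with a small parsing step. The sketch below is a minimal example, not the pipeline's actual code: the allowed value sets for each dimension are an assumption inferred from the values that appear in this response, and the two sample records are copied from it.

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = """[
  {"id": "ytc_Ugz4rL9S5aSpQ7xF3Gl4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgyJ1i98yuwlTWXQ8KJ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]"""

# Hypothetical codebook: the full value set per dimension is an assumption
# based only on the values visible in this response.
VALID = {
    "responsibility": {"ai_itself", "user", "developer", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed"},
    "policy": {"ban", "regulate", "liability", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "resignation", "mixed"},
}

def parse_codings(text):
    """Parse the model's JSON array, keeping only records whose codes
    fall inside the codebook for every dimension."""
    records = json.loads(text)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in VALID.items())
    ]

codings = parse_codings(raw_response)
by_id = {rec["id"]: rec for rec in codings}
print(by_id["ytc_UgyJ1i98yuwlTWXQ8KJ4AaABAg"]["policy"])  # -> unclear
```

The `by_id` index supports the "look up by comment ID" view: the table shown for a comment (Responsibility, Reasoning, Policy, Emotion) is just one record rendered by dimension.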