Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
The issue with this meme is that "Shoggoth" has only been exposed to human knowledge. Period. Nothing else. It only predicts a human's answer. The examples and events of this "meme" coming to light is just the AI getting a bias towards predicting an answer that is more extreme. You can't expect the first few models we make to be perfectly unbiased. That's stupid. If you want an AI that really doesn't think like humans, you have to replicate emergence. Create a physics that an AI trains under. That would truly create the meme of an AI not thinking like humans at all.
Platform: youtube · Video: AI Moral Status · Posted: 2025-12-20T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyv8XOTvvpYlfN4vu94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgwY34_mSErYiBkx4D14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugy3gwg1jEnMrwsYFct4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy6m7BzIxf4ah0N9tF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwKpsaGVSK_OMw_2_R4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxBU67NVmpNMWtiso14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzQQNWqLYdKaHB-aFl4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugxlc7FrAshjcNZdFth4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx0-0OdH5_6eb3-Qr54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwM6QvF5Sl1BxrKMfd4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"}
]
```
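The raw LLM response is a JSON array of per-comment codings, one object per comment ID, with one value per dimension. A minimal sketch of parsing such a response and sanity-checking the labels before they reach the results table — note the allowed label sets below are inferred only from the values visible in this sample, not from the full codebook, which may define more:

```python
import json

# Label vocabularies observed in this sample (assumption: the real
# codebook may contain additional values for each dimension).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "distributed", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "outrage", "approval", "fear", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject records with unknown labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim} value {rec.get(dim)!r}")
    return records

# First record from the response above, as a usage example.
raw = ('[{"id":"ytc_Ugyv8XOTvvpYlfN4vu94AaABAg","responsibility":"developer",'
       '"reasoning":"deontological","policy":"ban","emotion":"outrage"}]')
codings = validate_codings(raw)
print(len(codings))  # 1
```

Validating at parse time is what makes a lookup by comment ID trustworthy: any record where the model drifted outside the expected vocabulary fails loudly instead of silently populating a dimension table.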