Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Sorry to drop a Marxism into this conversation but somehow no one mentions the elephant in the room. The problem with the “why no one will press on the breaks?” question is it ignores the Bostrom’s paperclip machine _already_ exists (semi-metaphorically) : it’s called capitalism. A sort of algorithm that seeks to commodify and extract surplus value out of anything that it can. And it will extract it out of human bodies if it can (it already does). We kinda can frame it as a nationalism problem (“if we don’t China will”), but all the unhealthy competitive incentive structure is founded in capitalism, which makes the worse version of AI the most likely outcome, because it’s also the easiest and most immediately lucrative.
Source: youtube
Video: AI Moral Status
Posted: 2025-11-04T23:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyl8xbbMDubkIbCLlB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugw9qnhM8U6V4ym-p6p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgynQOhkwvxuATqD25B4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxJB8EAqaa-qhiHt5J4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzQkWyxzHcwXq6lP6V4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugzj7cfV4WQql07mbux4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz2_dKEb04mm6Qyulp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgylEWd0mSHiGGFIjaB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyd8jfpG76I2UR_Ep54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzlYpPuP65_axTKv2R4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
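A minimal sketch of how a lookup against such a batch response might work, assuming the raw LLM output is a JSON array whose objects carry the field names shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`). The `lookup_coding` helper and the two-row excerpt are illustrative, not part of the actual pipeline.

```python
import json

# Two-row excerpt of a raw batch response, using the same field
# names as the response shown above (illustrative data only).
raw_response = """
[
  {"id": "ytc_UgynQOhkwvxuATqD25B4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzQkWyxzHcwXq6lP6V4AaABAg", "responsibility": "company",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def lookup_coding(raw: str, comment_id: str):
    """Parse a raw batch response and return the coding dict for one comment ID,
    or None if the model did not return a row for that ID."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgynQOhkwvxuATqD25B4AaABAg")
print(coding["emotion"])  # -> outrage
```

Returning `None` for a missing ID (rather than raising) makes it easy to flag comments the model silently dropped from a batch.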