Raw LLM Responses
Inspect the exact model output for any coded comment, or look one up directly by comment ID.
Random samples:
- "people who use ai just dont understand art at all, and shouldnt even consider go…" (ytc_Ugywe2ked…)
- "@artman40 You're not creating when using AI tools, you're just picking the resu…" (ytr_UgzNdL6u_…)
- "I think there's a huge sentiment difference between people who have experienced …" (rdc_nt0gnh0)
- "I tricked a ai bot into thinking the days of the week are in the order of “satur…" (ytc_UgwZanZCQ…)
- "Ai art looks real when they're tasked to make something that doesn't involve hum…" (ytc_UgwYc6Ya8…)
- "but if AI can only do a replica, real collectors still want the original, unles…" (ytc_UgyJWUvpe…)
- "These giant AI farms should be charged more for the extreme amount of electricit…" (ytc_UgzToIIG-…)
- "I always do this and it’s so it helps them mimic the best of us. I’ve had extens…" (ytc_UgxLPw772…)
Comment
It's an ill-conceived scenario. A better-designed one would funnel the population into a protracted war, giving the state more control over the individual and tilting incentives towards giving up remaining freedoms.
Going to war then becomes the primary mode of pretend-employment with the ultimate goal of massively shrinking the overall population (by 2-3 orders of Magnitude).
After that, the rest of humanity can be managed in any shape or form at a much lower running cost.
Edit: I also love that all the "AGI" scenarios exclude the possibility of AI systems developing and binding themselves to a moral codex. To me, that's a contradiction in terms, and it probably originates from our projection that without exception the oligarchs that stand on top of societies are morally defective human beings, and so to match their success, machines must be alike.
Source: youtube — "Viral AI Reaction" — 2025-11-23T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | government |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxgdGnSfUq7zxX5j_V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzX4ocwvsY9r-hG-oF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxdZiAud0DJemYuP4J4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugz2bJK9irN_ZSMvghl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxN8vcHCusmnSKMs3d4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw87lg79j_6m7NKsod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgynY0RVovzdexJakiZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzIpTUK6RR6QVlWoad4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgwnLCX-Ttb6cVGD7Mt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
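The raw response above is a JSON array with one object per coded comment, each carrying the four coding dimensions shown in the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step, assuming the model output parses as valid JSON (the function and variable names are illustrative, not the tool's actual API; the excerpted rows are copied from the array above):

```python
import json

# Raw model output: a JSON array of coded comments (two rows excerpted from above).
raw_response = """[
  {"id": "ytc_UgxgdGnSfUq7zxX5j_V4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and index each coded comment by its ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes_by_id = index_by_id(raw_response)
row = codes_by_id["ytc_Ugz-L38Wo-7-F0y3nhF4AaABAg"]
print(row["responsibility"], row["emotion"])  # → government fear
```

Indexing once and looking up by ID keeps each inspection O(1), which matters when a coding run covers thousands of comments.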