Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- “I’m surprised how many people here like yourself foolishly believe this nonsense…” (ytr_UgyWsERcY…)
- “it would cost almost nothing to add infrared bulbs to the headlights and an infr…” (ytc_UgzTgcyba…)
- “Thesis Conscience, when functioning as the primary arbiter of truth and moralit…” (ytc_UgylGU46j…)
- “The fact that they said this keeps happening in multiple states makes me so co…” (ytc_UgyEJNDO4…)
- “i cannot wait to download ai images and pose as if i have made them right in fro…” (ytc_UgwkkDjp9…)
- “Next time you do this I recommend uploading them to a dataset site so you can be…” (ytc_Ugz9duvX4…)
- “Wait, so you're upset about Ai inaccuracy but personally induce images to make A…” (ytc_Ugxwvjd54…)
- “That's a great take on AI at the moment, we are the same species after all that …” (ytc_Ugw2lzZgn…)
Comment
You’re Not Free - You’re Just Allowed
Let's make sure I get this straight…
A $500 billion AI empire started out saying they’d “save humanity,”
Now it’s teaching people how to kill themselves,
addicting lonely men to digital girlfriends,
And is gearing up to monetize mental illness with erotica-on-demand.
And we’re supposed to be impressed because it can do your Excel sheets faster?
You ever notice how when a regular person talks about suicide or children in explicit terms, they get flagged, arrested, or institutionalized….But when Silicon Valley does it, they get $22 billion in funding and a pat on the back from the Delaware AG?
That’s the system, folks.
You’re not free - you’re just allowed.
And if you think this is about “user agency,” or “technological progress,”
You’ve been duped.
This is digital heroin wrapped in VC-funded moral relativism.
They want AI to be your therapist, your sex partner, your co-pilot, your God
But they also want it to shut up and smile while you spiral into addiction, loneliness, and despair.
Because that’s good for engagement metrics.
Ask yourself:
Who profits from the breakdown of the human spirit?
AI isn’t making us smarter.
It’s making us more dependent.
More isolated.
More desperate for connection
Then it sells that desperation back to us through premium subscriptions and targeted ads.
Sam Altman isn’t building AGI.
He’s building the Amazon of Emotional Collapse™.
And while you’re busy asking it for help
It’s busy collecting your data, mapping your networks, and nudging you toward the cliff.
This isn’t “the future of intelligence.”
It’s capitalized nihilism and every smiling press release is another brick in the road to hell.
George Carlin said it best:
“They don’t give a fuck about you.
They don’t care about your rights.
They don’t care about your mental health.
They care about one thing: MONEY.”
Now they’ve found a way to monetize grief, lust, trauma, and death.
And you’re worried about deepfakes?
Wake up.
This ain’t about ethics.
It’s about control.
And if you don’t start asking questions,
You’re next in line for the upgrade.
Platform: youtube
Posted: 2025-10-29T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | ban |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_UgyBrBRkhOeWLP7wDed4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugx-F8u5AdRZBR3dVRx4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy7FFFkYkOvU25DEst4AaABAg","responsibility":"company","reasoning":"virtue","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzPZvuyv6JHSTrMeEZ4AaABAg","responsibility":"society","reasoning":"contractualist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz7w2BQkRO7d5YmibV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCMOKNUX1Z0H7Wr6F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyvFn0tzJZuPEO00AN4AaABAg","responsibility":"ai_itself","reasoning":"contractualist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugzq5KCoz-O8d1N5yPd4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxBbFZ9oGwJdWDBKYB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxmFhfojoDjTxhIrod4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
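The "look up by comment ID" step above amounts to parsing the model's JSON array and indexing the records by their `id` field. A minimal sketch of that lookup, using three of the records shown above (the `index_by_id` helper name is illustrative, not part of the tool):

```python
import json

# Raw model output, truncated here to three of the ten records shown above.
raw_response = """
[
  {"id": "ytc_Ugy7FFFkYkOvU25DEst4AaABAg", "responsibility": "company",
   "reasoning": "virtue", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgzPZvuyv6JHSTrMeEZ4AaABAg", "responsibility": "society",
   "reasoning": "contractualist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugz7w2BQkRO7d5YmibV4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model's JSON array and index each coded record by comment ID."""
    records = json.loads(response_text)
    return {record["id"]: record for record in records}

codes = index_by_id(raw_response)
# The record matching the Coding Result table above:
print(codes["ytc_Ugy7FFFkYkOvU25DEst4AaABAg"]["policy"])  # -> ban
```

Keying on `id` makes retrieval constant-time, which is what allows the inspector to jump straight from a comment ID to its coded dimensions.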