Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- ytc_Ugwl37Uen…: Anyone remember the buzz about how game changing and paradigm shifting Segway wa…
- ytc_UgypB_Kc8…: Umm yea good waste of time coming from California bro. Thing is, it's so good no…
- ytc_UgyACHgtk…: Ai image generation is incredibly impressive, ai "art" sucks, the tech is cool, …
- ytc_UgykFA0bi…: This short age well, until Microsoft laid off their 20 years experienced softwar…
- ytc_UgzQE48s7…: I've been working with several different AI's for some time now, and I can tell …
- ytc_Ugx7G21fR…: How to make Employers choose humans over AI: Step 1: Make the humans more expens…
- ytc_Ugy6dL1Nb…: The ugly truth is that she is an Ai .... When the Android get full access to ev…
- rdc_nsf308m: With all the fervent competition, where AI fits into user workflows is a huge de…
Comment
These guys spend so much time talking about "when AGI is done right" and while they eventually get around to recognizing that government/democracy has to be in charge of it, not tech bros, there's still way too much utopian talk about this shit. Using generative AI makes people dumber, less creative, and uncritical thinkers because it makes people disengage those parts of their brains. Most of the material that AIs are trained from is taken from people, without their permission, because it's just scraping what it can find by internet search results. It's not the worst technology out there, but the people in charge of it are the kind of people who shouldn't be in charge of anything, and the abuses of generative AI are so bad, they outweigh the benefits right now.
youtube · AI Governance · 2026-04-23T12:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
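Each coded row can be sanity-checked against the closed set of labels for each dimension. The following is a minimal validation sketch; the allowed value sets below are inferred from the sample responses shown on this page, so they are assumptions rather than the canonical codebook.

```python
# Allowed labels per coding dimension. NOTE: these sets are inferred
# from sample responses on this page, not from an official codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed", "resignation"},
}

def invalid_fields(row: dict) -> list[str]:
    """Return the dimension names whose values fall outside ALLOWED."""
    return [dim for dim, ok in ALLOWED.items() if row.get(dim) not in ok]

# The coding result shown in the table above passes cleanly.
row = {"responsibility": "developer", "reasoning": "consequentialist",
       "policy": "regulate", "emotion": "fear"}
print(invalid_fields(row))  # -> []
```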
Raw LLM Response
```json
[
  {"id":"ytc_UgzN1RVez7mDDMSfw9J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugz1cBQDdpmkdmnGRGh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxY8pM83m2ScZt04OF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz6wxwdzr5anc-b-gx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwB3MiJWMxSm0ZsKNR4AaABAg","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzJyQRmvx_WqLPeSu94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyZFoSpPSlZWbk0GO14AaABAg","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz2KuCwVrmZ-k6yvVZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_Ugys59ifvGa9hYV0Uwl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyE9MklwP7rvByUu7h4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"resignation"}
]
```