Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I love some of the Neil Degrasse episodes of Cosmos. What the fuck does he know …" (ytc_UgyNIXVDc…)
- "but ai "art" can be used to trick and deceive people that thought it was origina…" (ytr_UgyQJD9Dd…)
- "When someone commented "cheap low effort dima a dozen pulp fiction" I was like "…" (ytc_Ugwni24cJ…)
- "I watched many videos showing off FSD. Tesla fan pages. Not only is Tesla res…" (ytc_UgxDZOliu…)
- "On the one hand, there are almost four billion years of biological evolution, a …" (ytc_UgyZw9ZsQ…)
- "Right now, the way AI is, I think it's way too hard to fine tune the end result …" (ytc_UgzRcc3Hg…)
- "It sounds like you're having a bit of fun with Sophia! While she may be an impre…" (ytr_Ugzalj8HE…)
- "I work in AI, it cannot be made safe. Guard rails will not work. It will annihil…" (ytc_UgxLYWWcc…)
Comment
Possible solution to these ideas are way above my head but let me ask what seems like a legit Q…
So the say unplugging it won’t solve the prob but what about a 2 part solution? The possibility of completely shitting down in internet, 100% and @ the same time launching a satellite system that blankets or covers the entire planet that stops any & all attempt’s for AI to do anything?
I mean, it wasn’t that long ago when there was no internet & mankind got along Ok, right? We can survive stepping backwards 50yrs, I would think, right?
youtube · AI Governance · 2026-04-18T18:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgyPnIRyehKYH1nrHeJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxMLgWi2f--UFr5Bf94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwXuKeSV1J64c-_fYZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgwcXAcKWmRrEi3cxnt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgzSC4CwJdPHnwDgnVd4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugz6-22jquack5R_qoV4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxPcO6VlquF7xCh_d54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgwexbnhbffvJOgSSkR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugwd3Ht_tZVElpCQ2PJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
{"id":"ytc_UgzDGweZcivDtEf12qR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
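A minimal sketch of consuming such a batch response: parse the JSON array, validate each row against the coding scheme, and index rows by comment ID for lookup. This assumes Python; the allowed value sets below are inferred from the sample output above and may be incomplete.

```python
import json

# Allowed values per dimension, inferred from the sample response above.
# This vocabulary is an assumption and is likely incomplete.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "contractualist"},
    "policy": {"none", "regulate", "ban", "liability", "industry_self"},
    "emotion": {"outrage", "fear", "mixed", "approval", "resignation", "indifference"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index coded rows by comment ID.

    Raises ValueError if a row is missing a dimension or uses a value
    outside the inferred vocabulary.
    """
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad value for {dim}: {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Example: look up one coded comment by its ID (row taken from the sample above).
raw = ('[{"id":"ytc_UgwXuKeSV1J64c-_fYZ4AaABAg","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}]')
coded = parse_batch(raw)
print(coded["ytc_UgwXuKeSV1J64c-_fYZ4AaABAg"]["policy"])  # regulate
```

Indexing by ID is what makes the "inspect any coded comment" lookup above cheap: each comment's coded dimensions are one dictionary access away.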