Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
This guy aware A.I. also told people to put Elmers glue on pizza to stop the che…
ytc_UgxknD21o…
The real problem is why is all the data is out there fed to AI
Shame that every …
ytc_UgzEGlgiY…
@RewindOGTeeHee my point was, its always like this, there has always been argume…
ytr_UgwSw3X9w…
AI isn't really art because it's never original. People will be fed up by it qui…
ytc_UgzkSy314…
Fantastic video. I know autonomous navigation and vehicle control is the future,…
ytc_UgwpcdPv2…
Is this why the art ai thing keeps white washing my darker skinned characters?? …
ytc_Ugw_3m9BN…
Imagine busting your ass off in undergrad to get into med school, getting throug…
rdc_fcsz2tz
Lessee... existential climate change, gamma rays from a Betelgeuse supernova, Ye…
ytc_UgwqnJ-N1…
Comment
While this man means well, and someday his work will be necessary... this currently functions as propaganda for billionaires.
Gen AI and especially LLMs will not achieve AGI. It's nonsense. In fact, Gen AI isn't really going to get a whole lot better than it is right now. Current AI is simply not built in a way that it can achieve what this guy's talking about, at least not anytime soon.
Gen AI's utility, as it currently stands, can be measured by its revenue - which is about 1/10th of what it spends on chips. It needs to cost about 10 times more, and it needs to not lose customers while raising those prices that much.
The only civilizational threat there is the stock market bubble it's creating.
youtube
AI Governance
2025-10-16T18:3…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | mixed |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxZYAv8uXW5Rq8OnHh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz_P0kiRybQYeJVF854AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxyNMABho-X1TS3s954AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugyvu7i70nIe5w2qjpt4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugw8qULQLdbBFsmGOaJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzk_CotArE_yVy3T-N4AaABAg","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy3khIuIHg7fB1GJVZ4AaABAg","responsibility":"unclear","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzSjHB33fyK3JxHOF54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxUKsYsaR-opcl_CKt4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwfkpl9SdgSiTcPyIl4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
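The raw response above is a JSON array of coded records, one per comment ID, with one value per coding dimension. A minimal sketch of parsing and validating such a response is below; the allowed category values are inferred from the labels that appear in this section, so the real codebook may define more (an assumption, not the dashboard's actual implementation):

```python
import json

# Category values inferred from labels visible in this section;
# the actual codebook may include additional values (assumption).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "government", "unclear"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist", "mixed"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "resignation", "mixed", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and reject out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {value!r}")
    return records

# Example with a hypothetical comment ID:
raw = '[{"id":"ytc_x","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"}]'
records = parse_coding_response(raw)
print(records[0]["policy"])  # regulate
```

Validating against a fixed value set catches the common failure mode where the model invents a label outside the codebook, so bad records fail loudly instead of being stored silently.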