Raw LLM Responses
Inspect the exact model output for any coded comment. You can look up a comment by its ID (a minimal lookup sketch follows the sample list) or browse these random samples:
- "Seems like you just don't like AI's answers. It's more likely that your opinion …" (ytc_Ugynx-2Oe…)
- "What Was The Problem With Self Driving Cars Nobody Is Talking About? I did not g…" (ytc_UgyID21uS…)
- "I wish the best of luck to you. I see things are going well already. Making conn…" (ytc_UgyRqRtRt…)
- "How funny is that we invented automation tools to make our life more slacking, y…" (ytc_UgwkQe-kl…)
- "The samples you gave shows how one replicated the other. It just cropped the ori…" (ytc_Ugzykh0Ap…)
- "The respect I have for Alex just rose incredibly, for the masterful way he manag…" (ytc_UgyS5rHTD…)
- "its called inproper generalised AI: It is the users fault because you did not tr…" (ytr_UgweZP0B5…)
- "AI is absolutely awful at coding. If you ask it to do anything even semi-complex…" (ytc_Ugx1dJZRl…)
Comment
> Just started the video... before I watch it, I really hope we're talking about "LLMs to become AGI by 2030"... cause if that's what we're about to see here, then there's NOTHING TO SEE HERE!
>
> LLMs will never... EVER... reach anything remotely CLOSE to AGI... let alone "Super Human Intelligence". Why don't people understand this concept? LLMs can't be "smarter than a human"... BECAUSE THEY LEARN FROM ALL THE THINGS HUMANS HAVE RECORDED ONLINE!
>
> Scraping the entire Internet and then somehow being able to come up with more than what you find on the entire Internet is like saying, "I'm going to buy all the land on planet earth and then I'm going to somehow come up with a way to have 10 times as much land as I own after I buy all of it."
>
> Nothing to see here folks, move along.

Platform: youtube · Topic: AI Governance · Posted: 2025-09-04T14:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
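
The four coding dimensions map naturally onto a small record type. Below is a sketch of one possible schema in Python; the field names follow the table above, and the value sets are only those observed in this sample batch, so the project's actual codebook may define additional labels:

```python
from dataclasses import dataclass

# Value sets observed in this sample batch only; the full codebook
# (an assumption on my part) may allow more labels per dimension.
RESPONSIBILITY = {"none", "user", "developer", "ai_itself"}
REASONING = {"unclear", "mixed", "consequentialist", "deontological", "virtue"}
POLICY = {"none", "unclear", "liability", "regulate"}
EMOTION = {"indifference", "fear", "approval", "outrage", "resignation"}

@dataclass
class CodedComment:
    """One coded comment: the four dimensions plus its source comment ID."""
    id: str
    responsibility: str
    reasoning: str
    policy: str
    emotion: str
```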
Raw LLM Response
[{"id":"ytc_Ugw2TTC0bU45v-y6K6F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugz8-dk0k9v4RrdePHx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},{"id":"ytc_Ugy7T1tHqMdt7PV6baV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgwShwnhD1kJu-qA9zh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},{"id":"ytc_UgwRcy_haqMhxlYIyeJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},{"id":"ytc_UgwpCxRGn9hKYw2EgbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},{"id":"ytc_Ugz1JXKYbcbAkD5CcsZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},{"id":"ytc_UgzSfB5cSFNiq9GKNZJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgzdGHhIxX0P109yhx54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},{"id":"ytc_UgyPXgclO0cMNpuGb7l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]