Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "I woke up today and DREW some beautiful AI art for my game. Then I VIBE CODED an…" — ytc_UgysLH2TM…
- "ai itself isn't an issue, its all down to how its used. That said, if you don't…" — ytc_UgzHp672J…
- "It's interesting how, once released, an AI becomes an object of learning for oth…" — ytc_UgzU9v2GC…
- "I had the worst experience with gemini llm. Omg it gave me directives and assure…" — ytc_UgwRVMqxE…
- "I'm a electrician in automation... We need people working on the ground... This…" — ytc_Ugyz7udAt…
- "How come developers of AI fail to see that the very thing they create today may…" — ytc_UgyQbcQh6…
- "Is driving a truck not just combing through a ton of data and making decisions b…" — rdc_fcrvn0e
- "One-sided and only promotional. The paper that Sal mentioned is credible but dat…" — ytc_UgxY4ZHLU…
Comment
I disagree strongly with 17:54. If you just learn to pass a single test but never actually use or need the material in practice, you will forget it within a month.
I had a friend in school who couldn't code, was very bad at logical reasoning, and never applied anything he learned. He just "remembered" stuff: he memorized everything he wrote down in lessons to pass the exams, then forgot it again.
And honestly, when I hear him talk, that is exactly what he did. He remembers pieces, parts of the terminology, vague concepts, but nothing in detail. AI just enabled him to think he actually knows things. I don't think AI is the reason he doesn't know things, because he probably never wanted to know them; AI is just what enabled the delusion.
And if my theory is correct, his schooling proves one thing: it's willpower. Learning for four years without understanding anything requires a lot of willpower.
Source: youtube · AI Jobs · 2026-01-15T19:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | user |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgyA0G8vkYYbXWO-WkV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxpnR-dcRm5tQtvNjV4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxo_dnBPxZvFvqgOVZ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyVQvC30kqET9cR8PJ4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwGZZE95L8i_NBFnOp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwOyrF2vT_KiNr1S-F4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugwsr8QT8dSzye1MiCV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"sadness"},
  {"id":"ytc_UgxuzccYaxIdEkP8cC94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzIMk4w6ncnQSOVU2N4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytc_Ugwx2MF7DhPBEx8wNed4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"}
]
```
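A raw LLM response like the one above has to be parsed and sanity-checked before the codes can be trusted. The sketch below is a minimal, assumed approach: it parses the JSON array and keeps only rows whose values fall within the value sets actually observed on this page. The `ALLOWED` sets and the `validate_codes` helper are assumptions for illustration, not the pipeline's real codebook.

```python
import json

# Value sets inferred from the codes visible in this page's output.
# ASSUMPTION: the real codebook may define more values per dimension.
ALLOWED = {
    "responsibility": {"user", "ai_itself", "none"},
    "reasoning": {"consequentialist", "virtue", "unclear"},
    "policy": {"none", "unclear"},
    "emotion": {"fear", "indifference", "outrage", "mixed",
                "approval", "sadness", "resignation"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed code rows."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Every row must carry a string comment ID.
        if not isinstance(row.get("id"), str):
            continue
        # Every coded dimension must use a known value.
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"user",'
       '"reasoning":"virtue","policy":"none","emotion":"mixed"}]')
print(len(validate_codes(raw)))  # 1
```

Rows that fail validation can then be re-queued for recoding rather than silently entering the dataset, which is why the check drops rows instead of repairing them.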