Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugx0swZ4k…` — "It's already 2025, why is WSJ still stuck with something happened in 2021, do th…"
- `ytr_UgxyUcVoS…` — "i've had no problem with AI being creative. is it perfect? no, but it gives me …"
- `ytc_UgwZU1gq5…` — "This is so true. I talked with ChatGPT as I would professionally and it spoke b…"
- `ytc_UgxR9znrx…` — "The big techs are based on stolen technologies from startups and graduate studen…"
- `ytc_Ugz9bF9D1…` — "Nobody has seen a dragon. We all know what a dragon looks like from existing art…"
- `rdc_lzazswp` — "Dude, some of those young guys would honestly be better off with robots. Especia…"
- `ytc_UgxrR749v…` — "There is another possibility to why they were so willing to backtrack. Who's to…"
- `ytc_Ugw1DauRi…` — "AI destroys humanity by replacing all our jobs. Homeless, obsolete workers brie…"
Comment

> I view Carmack's work on AI as a misguided venture such as his forays into space technology and VR headsets. He's a dreamer chasing rainbows and i just wish he applied his skills to pragmatic things that can be accomplished in the real world.
>
> You cannot achieve AGI with binary state machines, no matter how many parameters you feed into them they're still just binary state machines at the end of the day.
>
> Having Google in a chatbot that can rephrase answers and questions through a language model is awesome and no doubt a great aid to a lot of people.
>
> But it is still just Google through a chatbot, it's very very far from AGI

youtube · AI Jobs · 2024-05-11T15:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | unclear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugxusd-AyK6He8cfy2J4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgxUEzq4QQwHSpX3HH54AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgzTqRHsRGN5c6zuvRR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_Ugw8jc7-370qye9hUuB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgyMr-oja2HcgzQLepN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "unclear", "emotion": "approval"},
  {"id": "ytc_UgztTmsfqjG-yBjSGEh4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "unclear", "emotion": "outrage"},
  {"id": "ytc_UgyBUgy-hVDkF8gNUaJ4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugwo4ieCK2PeNb3Ns2p4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_Ugzp06e44aSgQdmBx1d4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "disapproval"},
  {"id": "ytc_UgwsCyEv7y241gaOgwp4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
```
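The "look up by comment ID" flow above can be sketched in a few lines: parse the raw model response (a JSON array of per-comment codes) and index it by comment ID so any row can be retrieved directly. This is a minimal sketch, not the dashboard's actual implementation; `index_by_id` is a hypothetical helper, and the sample reuses two rows from the response above.

```python
import json

# Two rows copied from the raw LLM response shown above; each row carries
# the four codebook dimensions: responsibility, reasoning, policy, emotion.
raw_response = """
[
  {"id": "ytc_Ugzp06e44aSgQdmBx1d4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "unclear", "emotion": "disapproval"},
  {"id": "ytc_UgwsCyEv7y241gaOgwp4AaABAg", "responsibility": "user",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]
"""

def index_by_id(response_text: str) -> dict:
    """Parse the model output and key each coded row by its comment ID."""
    rows = json.loads(response_text)
    return {row["id"]: row for row in rows}

codes = index_by_id(raw_response)
print(codes["ytc_Ugzp06e44aSgQdmBx1d4AaABAg"]["reasoning"])  # deontological
```

The same index also makes it easy to spot-check a random sample: pick an ID from the sample list and read off its coded dimensions.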