Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "The Statement: The Universal Zero-Day Exploit The Theory: If the universe is a …" (ytc_UgylasSC1…)
- "People love hating on Ai and yet most people use it.. It's just a tool. That's l…" (ytc_UgwswSco6…)
- "Gotta love the Ai bros in this comment section, if AI generation isn’t theft the…" (ytc_Ugxwir8IK…)
- "It struck me how sad he looked. Autonomously controlled weapons, mass unemployme…" (ytc_UgxWHKIIh…)
- "There is a way out of this black hole, one way...unfortunately humans are incapa…" (ytc_UgxPn1Vhn…)
- "I think AI can be very beneficial but there are so many dangers as well. AI not …" (ytc_UgwHt0Dph…)
- "Onlydepends on all of us people can work togheter with ai not ài taking all the …" (ytc_UgyZxP8Pb…)
- "On Dec3 2025, a paper was released revealing the origins of hallucination. Skipp…" (ytc_Ugzh2oucr…)
Comment
True. Cognitive scientists understand that once any "thinking" being (like us) figures out how to create other "thinking beings"... there's no end. This cyclic loop will continue forever... creating more and more capable/"smarter" beings. So AGI will quickly become ASI (Artificial Super Intelligence). The scale of growth of each iteration (cycle) will be exponential, where the exponent keeps increasing (e.g., first it's to the 3rd power, then to the 4th power, then to the 5th power... forever... which becomes SUPER SUPER rapid growth... of its "intelligence"/capabilities). This is why the top scientists and neuroscientists are concerned about AI. Not the AI of today, but the AI of 5-10 years from now (the top scientists only disagree in "how fast"... whether it'll happen in 5, 10, 20, or at most 30 years). Nearly all agree it WILL happen "soon" (in our lifetime).
youtube · AI Governance · 2025-12-02T12:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
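A coding-result table like the one above can be rendered directly from one parsed response row. This is a minimal sketch, assuming the four dimension fields shown in the raw response below; the function name and field ordering are illustrative, not the tool's actual implementation.

```python
from datetime import datetime, timezone

def coding_table(row: dict) -> str:
    """Render one coded row as a markdown Dimension/Value table.

    `row` is assumed to use the keys seen in the raw LLM response:
    responsibility, reasoning, policy, emotion.
    """
    lines = ["| Dimension | Value |", "|---|---|"]
    for dim in ("responsibility", "reasoning", "policy", "emotion"):
        # Fall back to "unclear" when a dimension is missing from the row.
        lines.append(f"| {dim.capitalize()} | {row.get(dim, 'unclear')} |")
    # Stamp the rendering time, matching the "Coded at" row above.
    lines.append(f"| Coded at | {datetime.now(timezone.utc).isoformat()} |")
    return "\n".join(lines)
```

For example, rendering `{"responsibility": "none", "emotion": "mixed", ...}` yields a table whose rows read `| Responsibility | none |` and `| Emotion | mixed |`.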
Raw LLM Response
```json
[
{"id":"ytr_UgydudYmj01AbmzyQjl4AaABAg.ASiqXf2up8bASk2zRT5j8P","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxCC4tHz3TRrG32Kwl4AaABAg.APUXqmhrsZ_AVErKKpv-x-","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugwh7rXlJbcF05PFLE94AaABAg.9vQGnDENGx1AVsFUy6z3-t","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgwM6tOp-dD711aidy94AaABAg.9vOsBQExl75APUl8PXWjsn","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgwXhvVFQMizm1MZAEN4AaABAg.9uz7zLI_LOw9v-OrKvUN3x","responsibility":"user","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytr_UgyMDukbLJApBgYT9nV4AaABAg.9uytRRDCifl9uzC0vvKYd0","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_UgxZim9cvUYEtNUQKKN4AaABAg.AQ3I0rQEmySAQBSl_X4Unt","responsibility":"government","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgxakkfdK20GPY_HgHR4AaABAg.APrz-3rK0w2AQEFCNUCufM","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugz6d1uOxOZ4mgRz1F14AaABAg.APgRi3toqbcAVmQSezAPvY","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgxrXac_HDq1J2t3OEl4AaABAg.APdLVG9-oy3APgVVLmX-oV","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"}
]
```
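Before a raw response like the one above is accepted into the coded dataset, it helps to parse the JSON and drop rows with out-of-vocabulary values. This is a minimal sketch; the allowed value sets are inferred only from the sample rows shown here (the real codebook likely defines more categories), and the function name is illustrative.

```python
import json

# Allowed values per coding dimension, inferred from the sample output above.
# Hypothetical: the actual codebook may include additional categories.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "user", "government", "unclear"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"unclear", "none"},
    "emotion": {"indifference", "mixed", "approval", "outrage"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose dimension
    values all fall inside the allowed vocabulary."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items())
    ]
```

A row such as `{"responsibility": "alien", ...}` would be filtered out, while every row in the sample response above passes.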