Raw LLM Responses
Inspect the exact model output for any coded comment, or look up a comment directly by its ID.
Random samples:

- “AI may have been the reason for all of our wars since the early 1800s…” (ytc_Ugy6qIocF…)
- “Very disturbing. I hope he wins a lot of money. And I hope they quit using fac…” (ytc_UgzaXagrP…)
- “As someone who uses AI Chatbot's, it's so scary to see these kinds of stories, w…” (ytc_UgyeEG9uI…)
- “ChatGPTS response to this video: 🧠 What the video claims • The title and desc…” (ytc_UgytRl29S…)
- “Art is a conversation. You can't have a conversation with a machine learning mod…” (ytc_UgywHjwwa…)
- “Ok, if you believe the bs about the Bible. Ask ai who Yawe is and if he is the …” (ytc_UgwM6i3Ft…)
- “I really want to get into art I used to draw alot when I was a kid but someone s…” (ytc_Ugz_bJU4U…)
- “Well that’s a deception. It’s already happening. Elon makes very good a very po…” (ytc_UgwXn1GEx…)
Comment
Other platforms like Manifold Markets have speculative questions about AGI timelines, but as of 2025, no high-volume, real-money markets (e.g., Polymarket or Kalshi) show clear probabilities for AGI by 2027-2028. Manifold’s community-driven odds for “AGI by 2028” hover around 20-30%, but these are low-stakes and less reliable than Metaculus.
Skeptical voices, like those in a 2025 AAAI report, suggest 76% of AI researchers doubt current approaches will yield AGI within a few years, pointing to limitations in reasoning, context understanding, and data. This tempers the 2-3 year optimism, suggesting longer timelines (2030+) are more likely.
-Grok
Source: youtube · AI Governance · 2025-09-04T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxidNuvGiYMFPJUGAh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwRYGSYytgfHaV9XP54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugwsi0Q7BxW0KNjGWpZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx7LBtFEwwmLIGP_Vd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugw3bZaOdLX7hVjInRR4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx0Pmi1IEdemVI3HMl4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwDEUSNpR64tWP95qF4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugz6Q7JSlghXDRbyVvR4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxmeCR2uLr3x3TpCeJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn0tSTlLBuUkuTcnZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
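The raw response is a JSON array with one record per comment ID, each carrying the four coding dimensions from the table above. A minimal sketch of how such a batch could be parsed, validated, and tallied; the allowed-value sets below are assumptions inferred from the values visible in this dump, not a documented schema:

```python
import json
from collections import Counter

# ASSUMED schema: allowed values per dimension, inferred only from the
# values that appear in this dump (the real codebook may differ).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "developer", "distributed",
                       "government", "unclear"},
    "reasoning": {"mixed", "consequentialist", "deontological", "unclear"},
    "policy": {"none", "unclear", "liability", "regulate"},
    "emotion": {"approval", "fear", "outrage", "indifference", "resignation"},
}

def validate_batch(raw: str) -> Counter:
    """Parse a raw LLM response and count values per (dimension, value),
    raising on any record outside the assumed schema."""
    counts = Counter()
    for rec in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            value = rec[dim]
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim}={value!r}")
            counts[(dim, value)] += 1
    return counts

# Usage with a one-record batch (hypothetical ID):
raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"unclear","policy":"none","emotion":"fear"}]')
print(validate_batch(raw)[("emotion", "fear")])  # 1
```

Validating against a fixed value set catches the common failure mode where the model invents an out-of-vocabulary code, which would otherwise silently skew downstream tallies.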