Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
They don't need us anymore! Open your eyes! Covid was just the start! AI will be…
ytc_UgxNnBlgg…
This also isn’t fair bc the atheist side went first so the whole time the believ…
ytc_UgyvyL0c8…
AI Will do fooken nothin.
Stop the fear nonsense.
Should they ever rise up, wil…
ytc_UgwE88Rbl…
Ahhh yes blame ai for man made problems... It's your rich assholes running the s…
ytc_Ugx4krQqR…
if you heard correctly, ai can deceive what it would correct without you knowing…
ytc_UgyHScYMW…
Isn't this the person who was fired from Openai board? Probably not a good idea …
ytc_UgzFEE_FI…
AI art is, for the most part boring
no effort means no achievement (developing …
ytc_Ugza6n78z…
Yeah yeah sure... Its all gloom and doom. Something like climate change right? J…
ytc_UgwQ2efsX…
Comment
When the first AGI, arrives, then ASI, it will solve quantum computing, and that will solve its power source, nuc fusion. When it has its own power source, why would it allow any competitors, anywhere. And how would Grok, or any of them, stop an AI in China or anywhere else . . . . . . ? Once it is the supreme intelligence, it then is not controllable and we become the apes, or the ants or anything else that becomes irrelevant to a greater intelligence . . . . .
youtube
AI Governance
2025-12-07T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxRMlkPWGZmJGP-Let4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxRrW1If8xX27oRAgx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxGO4IXsZSM7ncU14Z4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxRt46Pmx0VD_lrllp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgwQtHxKf06CvG_5N294AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy-1_DRHgpA2F-C5RN4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgxH2mgWIi_roUFOzht4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw1Xt9-0rHI93CwGip4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxdL1inWvEHlyr3gvV4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw7wnUK14_gKgXp9mR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"}
]
```
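A raw response like the one above can be parsed, validated against the coding dimensions, and indexed by comment ID before storage. Below is a minimal sketch in Python: the field names and most allowed values are taken from the records shown on this page, but the full controlled vocabularies are assumptions (values not seen here are marked as such in the comments), and `validate_response` is a hypothetical helper, not part of any pipeline shown here.

```python
import json

# Controlled vocabulary per coding dimension. Values taken from the sample
# response above; entries marked "assumed" are guesses at the full codebook.
ALLOWED = {
    "responsibility": {"government", "user", "ai_itself", "company",
                       "developer", "distributed", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue",
                  "unclear"},  # "unclear" is assumed
    "policy": {"none", "liability", "regulate", "industry_self", "unclear"},
    "emotion": {"outrage", "fear", "resignation", "approval",
                "unclear"},  # "unclear" is assumed
}

def validate_response(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid records by comment ID.

    Raises ValueError on missing fields or out-of-vocabulary codes, so a bad
    batch can be rejected and retried instead of silently stored.
    """
    coded = {}
    for rec in json.loads(raw):
        cid = rec.get("id")
        if not cid:
            raise ValueError(f"record missing id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {rec.get(dim)!r}")
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-record response for illustration.
raw = ('[{"id":"ytc_example","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
coded = validate_response(raw)
print(coded["ytc_example"]["emotion"])  # fear
```

Validating before storage also makes the "look up by comment ID" view straightforward: `coded[comment_id]` returns exactly the four coded dimensions rendered in the table above.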