Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Okay, so that means I should stop cursing out my chatbot...? Heh!
This is goin…
ytc_UgyElpag9…
Devastating! Open AI is a ticking time-bomb, soon we'll have no control over the…
ytc_UgyGa6QfI…
The thing that worries me (as an educator) is that ChatGPT makes it more difficu…
ytc_Ugx4gAU7c…
There this little thing called *practice*, because nobody starts out amazing at …
ytc_UgxRAmz8o…
Oh hey, you are using a comparison that doesn't work in the slightest. Cars work…
ytr_UgzQY-i6A…
AI is going to kill Google like Google killed the Yellow pages. What did anyone …
rdc_m27ymg1
45:23 I feel like the guest is personifying the computer software too much. It's…
ytc_UgzFY7J9Q…
Can't we go over this? AI will be very smart but they will always be just machin…
ytc_Ugw1r5uas…
Comment
The "AI problem" is already being corrected by the market. AI has perturbed the marketplace enough to cause consumer spending to slow down, which matters much more in the long run. I predict that the present-day rapid adoption of AI is sprinting toward a massive market correction in 2026/2027, and the value of Nvidia, OpenAI, Microsoft, etc, will drop radically. It is a delusion to put AI in front of the economy. We'll probably start to see social movements against AI next year, 2026.
youtube
AI Governance
2025-10-30T02:4…
♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgylNNNcXdI8EIYG3bh4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzp78buOGTgqxQ3P9h4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugx4WjAdlyPCRuKz2-l4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyImTagFpAxOxZyrCN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTja9qPpQHQUBc-NV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzSIE285aOi5rawTpx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgzxAzkYe9BSejxDSMJ4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz6m508lzq6vHXo1i14AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyo5NlP_NsLeji5jVV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
{"id":"ytc_UgzOyxGhkwMjEMPqhT14AaABAg","responsibility":"market","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
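The raw response above is a JSON array of coded records keyed by comment ID, so looking up the coding for one comment is a parse-then-index operation. A minimal sketch, assuming the response is well-formed JSON with the field names shown above (the `index_by_id` helper name is hypothetical, not part of the tool):

```python
import json

# Raw LLM response, as displayed above: a JSON array of coded records.
# Only one record is reproduced here for brevity.
raw_response = """
[
  {"id": "ytc_UgzOyxGhkwMjEMPqhT14AaABAg",
   "responsibility": "market", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]
"""

def index_by_id(raw: str) -> dict:
    """Parse a raw LLM response and map comment ID -> coded dimensions.

    Hypothetical helper; assumes the response parses as a JSON array
    of objects that each carry an "id" field.
    """
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codes = index_by_id(raw_response)
code = codes["ytc_UgzOyxGhkwMjEMPqhT14AaABAg"]
print(code["emotion"])  # indifference
```

In practice the parse step is where malformed model output surfaces, so wrapping `json.loads` in error handling (and logging the offending raw string) is worth the extra lines.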