Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "been using GPTHuman AI for a while now, and it really helps content feel more na…" (ytc_UgwjoVTFM…)
- "The only way to "deal with" deep fake porn the way they want is to outlaw collag…" (ytc_UgzcLQ-Fm…)
- "If so many people are loosing their jobs, than who is going to pay their income?…" (ytc_UgwIeb3Z6…)
- "Today’s AI models absolutely do not understand humor. That’s an extreme exaggera…" (ytc_UgzEeoWCw…)
- "From my perspective, I think there are still a ton of jobs that can't be replace…" (ytc_UgzrLIUkf…)
- "They building an interest none of them on ground and making use .. all Marketing…" (ytc_Ugx0ocVrp…)
- "Big Tech Companies: “you guys, AI is absolutely NOT coming for your jobs” [proc…" (ytc_Ugy3aJ0tE…)
- "for me thats all BS, 1.earth never been our property thinking it is ours its jus…" (ytc_UgzbSW5ZJ…)
Comment
Dr. Roman Yampolskiy's prediction is indeed provocative and sits at the extreme end of the spectrum of AI thinking. To judge whether it will really happen, we need to dissect the argument with a bit of healthy skepticism and a broader economic perspective.
What follows is an analysis of the prediction and of the fate of precious metals:
1. Will the AI Prediction Come True?
Technically, the direction Dr. Roman describes is possible, but the timescale is highly debatable.
• AGI by 2027: This is a very aggressive prediction. Although AI progress is exponential, many researchers argue that we still need a fundamental breakthrough in awareness, or in understanding real-world context, that current language models do not yet have.
• 99% Unemployment: History shows that technology usually replaces tasks, not entire jobs. When the steam engine appeared, muscle-powered work disappeared, but millions of new jobs emerged in the industrial and service sectors. The difference this time is that AI targets intelligence itself. Even so, a transition to 99% unemployment within 5-10 years is logistically very difficult, because regulation, cultural adoption, and government bureaucracy usually move far more slowly than the technology itself.
2. The Fate of Gold and Precious Metals
Dr. Roman's argument that gold will lose its value because AI can "make" gold or mine asteroids faces several major obstacles:
Physical and Economic Obstacles
However intelligent AI becomes, it is still bound by the laws of physics.
• Nuclear Transmutation: In theory, we can make gold from other elements using particle accelerators, but the energy cost is enormous: to date, synthesizing one gram of gold costs far more than the market price of gold.
• Asteroid Mining: This is an exciting future prospect, but the physical infrastructure (rockets, mining robots, space logistics) will take decades to become cost-efficient.
Gold as the "Floor" of Security
If the worst-case AI scenario occurs (a collapse of digital systems), Bitcoin will be hard to access without electricity or a stable internet connection. In conditions of total crisis:
• Gold is a physical asset: It needs no electricity, cannot be hacked, and has a 5,000-year history of holding value.
• Human Psychology: People tend to trust something they can hold when digital systems feel uncertain.
3. Diversification Strategy: Dividends and Real Assets
For those who watch the capital markets, relying on a single type of asset (only Bitcoin, or only gold) is often risky.
• Productive Assets: Shares in companies with strong fundamentals that regularly pay dividends retain value because they are backed by real assets (factories, infrastructure, land) and cash flow. AI may change how companies work, but humans will still need food, shelter, and energy.
• Scarcity vs. Utility: Gold is valuable not only because it is rare, but also because of its usefulness in the electronics and medical industries, which will only grow in the AI era.
youtube
AI Governance
2026-04-21T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugz38PS9yoUHfmOaBtF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxvMj15IjeLWirBYtF4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz13EyNZZOmtyG9_fN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugw0PHyQdS3S62n5ya94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxJBmwDpIA40sME6H54AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwYtxs9WIK-CYOxW6N4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBTy-OL7E6CGD0R2x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxcCFFBLNaRs6SfzJ14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwoa83uR4L4psHRABt4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugya_3Sfjdg-aAYE14N4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}]
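A raw response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator in Python; the allowed values per dimension are inferred from this sample output rather than taken from the project's actual codebook, so treat them as assumptions:

```python
import json

# Allowed values per dimension, inferred from the sample response above;
# the real codebook may define additional categories (an assumption here).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "government", "distributed", "user", "developer"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"none", "regulate", "unclear"},
    "emotion": {"fear", "approval", "outrage", "indifference", "mixed"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and reject incompletely coded records."""
    records = json.loads(raw)
    for rec in records:
        if not rec.get("id", "").startswith("ytc_"):
            raise ValueError(f"unexpected comment id: {rec.get('id')!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r} value {rec.get(dim)!r}")
    return records

raw = ('[{"id":"ytc_Ugz38PS9yoUHfmOaBtF4AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
print(len(validate_codings(raw)))  # prints 1
```

Running the validator on a malformed record (for example, an emotion value outside the set) raises a `ValueError` naming the offending comment id, which makes bad model outputs easy to trace back to their source.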