Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Bro died cause of a robot LMAOOOOOO. Omg he was 23 too?!?! God damn, I could see…" (`ytc_Ugwl7-xnC…`)
- "So you wanna get all these robots 🤖 to handle almost everything let’s say you ma…" (`ytc_Ugx3k2CUs…`)
- "Ai genuinely has zero benefit, I agree with bro, huck it into the Sun 🫡 Technolo…" (`ytc_Ugw2LfcyZ…`)
- "Imagine the day when you can put your hand in a scanner or qtip your cheek and g…" (`rdc_f1emf6r`)
- "Similarly, I heard of efforts to estimate chances of short term survival for tra…" (`rdc_f1eirvp`)
- "If you relay on AI on all jobs and as the speaker said 99% unemployment rate, th…" (`ytc_UgyGh0Q_9…`)
- "I’m pretty sure I don’t want a Chinese robot loaded with spyware in the surveill…" (`ytc_UgwDuG0Z2…`)
- "Yampolskiy's academic credentials are not strong for what he's claiming. PhD f…" (`ytc_UgxTt_woL…`)
Comment
They all forget an important step. For all those great things to happen, humanoid robots, job automation etc., we need a massive amount of resources. Whoever controls the resources is in power as well and the economic pressure through automation, will drive the resource prices to infinity.
This will 100 percent trigger a weaponised conflict.
So an AGI powered full scale war over resources will come first.
This’ll be another Oppenheimer moment. The race already started and whoever will be first, will also win this war.
The future isn’t looking too bright, we can just pray that’ll still be alive to see the good AI could bring.
youtube · AI Governance · 2025-09-09T08:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxFfdZTgu7f0_eGX6J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugxbio8D48And2QFCL14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgxBqbDnwmjiTqVoTFh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwaZTdskMVvt1MzbNV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxIHWiBEENuB3Am-nZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyJUmB1RFimh0Tf9M14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxlTJWsnKD4IzLINch4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugwp9BAkE7tel2UTcyh4AaABAg","responsibility":"user","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxoxEyENeNrYdltMQR4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwKucd3vZmZqrIYixp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
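A response like the one above can be parsed and sanity-checked before the codings are stored. Below is a minimal sketch; the `ALLOWED` sets are inferred only from the labels visible in this response and the table above, so the real codebook may permit more values (`validate_codings` is a hypothetical helper, not part of the tool shown here).

```python
import json

# Allowed labels per dimension, inferred from the codings shown above.
# The full codebook may define additional labels; these sets are assumptions.
ALLOWED = {
    "responsibility": {"none", "distributed", "user", "government"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "ban"},
    "emotion": {"indifference", "fear", "approval", "outrage", "resignation"},
}

def validate_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows with an id and known labels."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if not isinstance(row, dict) or "id" not in row:
            continue  # malformed row: skip rather than guess
        if all(row.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid

raw = ('[{"id":"ytc_x","responsibility":"distributed","reasoning":"consequentialist",'
       '"policy":"unclear","emotion":"fear"}]')
print(validate_codings(raw))  # the single well-formed row passes validation
```

Rows with unrecognized labels are dropped rather than corrected, so a malformed model response surfaces as missing codings instead of silently wrong ones.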