Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Actually, it's OpenAI itself that's f*cked. OpenAI owes over a Trillion in finan…" (ytc_UgxZaVbAO…)
- "If you learn your company is laying people off for AI, quit that company. Fuk th…" (ytc_Ugxalg1HS…)
- "This is entirely unacceptable. Just looking at some of the replies on 9:43 is di…" (ytc_Ugz4qrcGW…)
- "Making AI art for yourself is all good and fine. But posting it for likes and at…" (ytc_UgxCD5bNE…)
- "AI is most likely to be an amoral psychopath, and that is how it's turning out.…" (ytc_Ugzou0kMo…)
- "Bernie really needs to shake up his “billionaires bad” playbook. It’s tired. ALL…" (ytc_UgxgUEw1d…)
- "I think humanity is making a huge mistake, something big is going to have to hap…" (ytr_Ugy878jXT…)
- "Ai is killing our planet delete your channel right know f you that's disgusti…" (ytc_UgyiklvUi…)
Comment
Most of the things he says will not happen or will take much more time to accomplish:
1. AGI is much harder to do than he imagines. Scaling existing techniques will not lead us toward AGI.
2. UBI doesent work. Its like communism which failed in the 20 the century. AI and AGI will NOT enable UBI! Merit based societys will always be more efficient and bring more prosperity.
AGI is like fusion power it will always be right around corner and then takes at least 5 to 10 year if not more! Mark my words and remindme! in 10 years.
youtube
AI Jobs
2025-06-19T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgzLICWFpuXWItpE7uJ4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxGk2hj8AsIPjJ2GfZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwm1VRVWVuwomRZjrt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgyXTY_M1K2BUAq42rh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzXOb6G055C7X3UUe94AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzXUaoo7BPEEtJ5CnZ4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxvztZytD3wx0dYpOd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTgze5Tw-4iz4TJJ94AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugxa8jw0B56-zq6tcol4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgxiNa2lHp9DK_252B14AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```