Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_Ugwav78Z_…`: "Ai art is and should only ever be for personal and use like Dnd portraits for ch…"
- `ytc_Ugyyz0QsL…`: "This guy doesn't get it. When wooden rubber tires were invented, horse drawn car…"
- `ytc_UgxHu__-L…`: "If you train an AI on every creative thought from humans, it will recite them al…"
- `ytc_Ugy_S0WD-…`: "It's mass thought that allows AI to progress intellectually. Whether good or bad…"
- `ytc_Ugzd1uiAK…`: "Elon has clarified this multiple times, but let me reiterate it here. The decisi…"
- `ytc_Ugxx6C_De…`: "Stop supporting companies with A.I platforms, no tax credits for A.I. agents onl…"
- `ytr_UgzVWQLhT…`: "You bring up some very good questions there. The answers as to what governments …"
- `ytc_UgxFdlfV0…`: "I swear if they make a fucking movie using AI or making animations using AI. 💀…"
Comment
This theory doesn't seem realistic to me in that proportion, even for operational jobs. The reason is simple: the production line and technological scalability ultimately aim to sell technology to users and companies that develop other products to be purchased by the population across various social levels. If AI breaks the economic cycle, it breaks the very reason for its existence. There is no way to concentrate capital among so few people without the resulting level of social chaos leading to political interventions.
That is where the importance of state intervention lies—in not allowing this to happen. The risk here is that these interventions might follow the same model as free-market capitalism, where underdeveloped countries suffer the full brunt of this process while global economic hubs hoard all the wealth and point an atomic bomb at underdeveloped nations, creating a form of neo-slavery on a massive scale.
youtube · AI Governance · 2026-04-08T16:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
{"id":"ytc_UgxkR8kFk4g8AQKPvWZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwGiNK-UTcRcRWJBbB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwyCqQP9s8_RYnDmWN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwKiBUQdX39BMGPiYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzKBSAbqr-SVR1g_oF4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgylRDFlElEh3ZelIQx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzrnTO1DdGQ8RwNYu94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgypnXgcDN5HfavtrCN4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugy932JTkF5IbMs3DQV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyghNaZ_KsnItaJNbt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"}
]
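The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the result table. A minimal parsing-and-validation sketch follows; the function name `parse_coding_response` and the `ALLOWED` vocabularies are illustrative assumptions (only values that appear in this sample are listed, not the full codebook):

```python
import json

# Assumed value vocabularies per dimension, taken only from the
# values visible in this sample response (not the full codebook).
ALLOWED = {
    "responsibility": {"none", "developer", "company", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "virtue"},
    "policy": {"none", "regulate", "liability", "ban"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_coding_response(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    skipping entries with a missing id or out-of-vocabulary values."""
    coded = {}
    for entry in json.loads(raw):
        comment_id = entry.get("id")
        if not comment_id:
            continue  # malformed entry: no comment id
        codes = {dim: entry.get(dim) for dim in ALLOWED}
        # Keep the entry only if every dimension holds a known value.
        if all(codes[dim] in ALLOWED[dim] for dim in ALLOWED):
            coded[comment_id] = codes
    return coded
```

Validating against a fixed vocabulary like this catches the common failure mode where the model invents a label outside the codebook; such entries can then be re-queued for recoding rather than silently stored.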