Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "Facebook is still the biggest facial recognition database in the world. Plus th…" — ytc_UgxzGalBh…
- "I'm in AI/Machine Learning field and an artist myself (watercolor, and diorama),…" — ytc_UgxKSDv9V…
- "Lol It's not a "direct match" the general face and jaw shape changes from the be…" — ytc_UgzwIGqnL…
- "Once AI is released then humans will adjust. Humans will always have the advanta…" — ytc_UgySzElfs…
- "As someone who's in his last year of visual communication course. I'm truly dish…" — ytc_UgxGbBKd5…
- "Did you forget that they are perfecting an army of perfect soldiers that never n…" — ytr_Ugx76S8I4…
- "Don't worry, AI isn't actually intelligent and won't be for many years, it takes…" — ytr_Ugz2pUaPk…
- "it is said that the AI will become superior to the human being, are we superior …" — ytc_UgwXdkNqX…
Comment
If the wealthy simply adapt to make products only for the wealthy there isn't always a place to keep the poor around as entertainment. It might work in videogames but that doesn't translate to other industries like agriculture/food. It's simply more efficient to let poor people starve to death.
When we look at solutions like UBI we investigate a lot about the behaviours of people who are receiving this money for doing nothing. But we don't really do the same for owners of all the robots. They are receiving billions of dollars for also doing nothing. In these sci fi AI autonomous economies, nobody actually does any work. That's the whole point of automation. We might not actually achieve such an economy but it's not a pipe dream that we may come close. So the big question is, if almost no work needs to be done by humans, why do we let the decision of who is allowed to live and who must die fall upon whoever owns a bunch of robots and computers? You want me to believe that I should starve to death because there isn't a piece of paper that says I am the owner of some robots? That's a moral catastrophe. One which I don't think most humans will accept. If that's what the system demands, it seems likely that the outcome will be some kind of social revolution. The might of the masses against the wealthy few. It's starting to sound like a familiar narrative - almost like some kind of predictable pattern. A flaw etched within the foundations of our economic system.
Like Charlie Munger famously said; show me the incentive, and I will show you the outcome. The incentives of capitalism are to maximize shareholder value. The end game of that incentive is that everything belongs to the shareholder. Nothing is left for you. Ironically, not unlike the precautionary writings of Asimov, you need to be careful about the rules/instructions/incentives you provide to a system. Not just an AI system but an economic system too. Like the horror stories of AI over-optimising human happiness by eliminating all humans, capitalism can over-optimize for shareholder value by removing everyone else but said shareholder from the equation. In reality, it seems, AI isn't the one that will kill us. We will kill ourselves; AI will just be the instrument.
youtube · AI Harm Incident · 2024-07-28T21:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx_ve3VZvPR4LEB4gh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz0YPCu-_-siDm-b-B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgzuNswLpT5ZLpRYLK14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
{"id":"ytc_Ugz-z6rZwBaWIlrDglV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy-lcQnkQ_SkjcLyCh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_Ugyhwvb-ZeCeB2XshWF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgydTn14A1Xfgq7xUlZ4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFQnnkpFrg7LEDN6J4AaABAg","responsibility":"government","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMEE-ubW7lBFA2V0h4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugy-008YyjU-jRUBU4B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
```
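A raw response like the one above can be parsed and indexed by comment ID so that any coding can be looked up directly. Below is a minimal Python sketch, assuming every record must carry the five keys visible in the response (`parse_codings` and `REQUIRED_KEYS` are illustrative names, not part of the tool); the payload here is a verbatim two-entry subset of the full response:

```python
import json

# Verbatim subset of the raw LLM response shown above (two of the ten records).
raw = """
[
 {"id":"ytc_Ugx_ve3VZvPR4LEB4gh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
 {"id":"ytc_UgzuNswLpT5ZLpRYLK14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}
]
"""

# The five keys every coding record in the response appears to carry.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def parse_codings(payload: str) -> dict:
    """Parse a raw LLM coding response and index the records by comment ID.

    Raises ValueError if any record is missing one of the required keys.
    """
    records = json.loads(payload)
    by_id = {}
    for rec in records:
        missing = REQUIRED_KEYS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')!r} missing keys: {missing}")
        # Store the coding dimensions keyed by the comment ID.
        by_id[rec["id"]] = {k: v for k, v in rec.items() if k != "id"}
    return by_id


codings = parse_codings(raw)
print(codings["ytc_UgzuNswLpT5ZLpRYLK14AaABAg"]["policy"])  # prints "liability"
```

The required-key check guards against the model dropping a dimension from a record, which is a common failure mode when coding in bulk.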