Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews, with truncated comment IDs):

- ytr_UgyvopTA8… — "That’s just a shallow way of thinking. Humans do essentially the same thing that…"
- ytc_UgwKjITVu… — "One of the biggest problems we have is... finding arms and legs and lower torso…"
- ytr_UgwanavFK… — "The tesla has the right of way became the tesla does not have a yield or stop si…"
- ytc_Ugyig8i9t… — "Because there is no real such thing called AI, it is just LLM. Replacing devs wi…"
- ytc_UgwL5UyO2… — "Basically, the robot was not programmed to be suspended by its shoulders. One li…"
- ytc_UgwJQQr6U… — "It’s hilarious how ppl are acting like chatgpt is some all knowing being but eve…"
- ytc_UgwxH1gLq… — "All these ai tech bros clearly want to make pieces of art, but they believe that…"
- ytc_Ugydv_xPC… — "I've struggled to articulate my feelings about AI for a while, but I fully agree…"
Comment
Companies won't even pay people to be good at their jobs, they'll fire them if their talent becomes inconvenient, or if their awareness of more than the compartment of the business they are a part of grows too large, or, God forbid, they do anything like calling out a societal ill being propagated by the company.
They already don't want educated, wise, capable, competent critical thinkers. You can tell, because they will unironically post job vacancies aimed at graduates with stipulations like "at least five years industry experience" for entry positions.
AI in a world run for the good of every person on it is fine. AI in a world run for the benefit of about 5000 people out of 8 Billion, the rest of whom are exploited by those few thousand, is dystopian and can never be anything else.
Source: youtube · Video: AI Moral Status · 2025-10-30T19:3… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | virtue |
| Policy | liability |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[
{"id":"ytc_Ugz7To3N3bTqWHRXAWd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzg3My9h6MiHmdkDD54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgzS6P_qp6JJzzMBB394AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzLgdhp4_xZ5n82po54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_UgxMJlOHwQNVVDW5kz14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwMu7jkPZ781oZvapV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_Ugxo6c3EvZkZGen8eaN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugz0MG1VkiFCZxQxg794AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugx3nSuDFDjpcBaDBdF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzUrlFSrmKEOxF9n-N4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"}
]
```
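The raw response is a JSON array of per-comment codings, so looking up a single comment's coding by ID amounts to parsing the array and indexing it. A minimal sketch (the variable names are illustrative; the two entries are copied from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codings.
# These two entries are taken verbatim from the response shown above.
raw_response = """
[
  {"id":"ytc_Ugxo6c3EvZkZGen8eaN4AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugz7To3N3bTqWHRXAWd4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

# Parse the array and index it by comment ID for constant-time lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Look up a coded comment by its full ID.
coding = codings["ytc_Ugxo6c3EvZkZGen8eaN4AaABAg"]
print(coding["emotion"])  # outrage
```

The lookup result for `ytc_Ugxo6c3EvZkZGen8eaN4AaABAg` matches the Coding Result table above (company / virtue / liability / outrage); the truncated IDs in the sample list would need to be expanded to full IDs before they can be used as dictionary keys.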