Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "An unscrupulous person would make an Ai conspiracy slop to convince the “trust m…" (ytc_Ugzq43qO8…)
- "Look I disagree with Neil on AI ... he talks about "automatic" braking . The con…" (ytc_UgzwJhd0q…)
- "Regulations should be in place, and it seems like a good idea to do it on state …" (ytc_UgzBi7P4n…)
- "@IsaacArellano1977 You're right. Havine an AI girlfriend IS inded much too advan…" (ytr_UgxKnYav0…)
- "Replacing humans with AI to cut expenses because they weren't "efficient" enough…" (ytc_UgxLdgaIR…)
- "At this point in time, I find it jaw dropping how much academics are ready to be…" (ytc_Ugx-vK2QN…)
- "And what if you live in a thrid world country and cannot afford AI? And what if…" (ytc_Ugwg8lViO…)
- "I don’t really have anything else to say that hasn’t already been said, so I’m j…" (ytr_UgweIHian…)
Comment

> The wealth of a country is how much this country produces or more general how much benefit it creates. If AI takes over jobs, it’s not like the wealth decreased. Its just that the wealth redistributed.
>
> With that in mind this isn’t a problem about people not having a job, it’s not about AI. It’s about the same old question people have asked themselves since the Industrial Revolution: how do we distribute the wealth we create in a fair way?
>
> Sadly you are right, exploitative capitalism is so engrained into the country and society, that it will likely end like you say. Some people who own the AI will get incredibly rich by benefiting from it and the rest will starve. But there is hope: people still need to buy all those things that AI produces. So you have to give them some money at least.

Source: youtube · Topic: AI Jobs · Posted: 2025-08-29T03:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgzAmt4LqyU_yqviDqt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"resignation"},
  {"id":"ytc_UgztzfnnxltMpd3m1-N4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxfKwPgyPXR2ShPSGV4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzDga93NBlXv43UhUp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyz_ihm0BGZRlcexX54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugx2oDY4uReE8QsUoFp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugz_2FJzHztCtjAYBRh4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_Ugykj2gU-jSrIpkl9SZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"confusion"},
  {"id":"ytc_UgzlU5TqUSwLUyd02Bx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyOLMLkwQOs7hkH4VF4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"fear"}
]
```
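A batch response in this shape is straightforward to post-process. The following is a minimal Python sketch of parsing one raw response and indexing it by comment ID; the per-dimension category sets are inferred from the examples on this page (the actual codebook may define more values), and the function name is illustrative, not part of the tool:

```python
import json
from datetime import datetime, timezone

# Allowed values per coding dimension, inferred from the sample records above.
# Assumption: the real codebook may contain additional categories.
SCHEMA = {
    "responsibility": {"distributed", "company", "none", "ai_itself", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"resignation", "fear", "indifference", "outrage",
                "approval", "confusion"},
}

def parse_llm_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    and return the records indexed by comment ID."""
    coded = {}
    for rec in json.loads(raw):
        comment_id = rec.get("id")
        if not comment_id:
            continue  # skip entries without an ID
        # Record (rather than drop) any value outside the known codebook,
        # so out-of-schema outputs can be reviewed later.
        issues = [dim for dim, allowed in SCHEMA.items()
                  if rec.get(dim) not in allowed]
        coded[comment_id] = {
            **{dim: rec.get(dim) for dim in SCHEMA},
            "issues": issues,
            "coded_at": datetime.now(timezone.utc).isoformat(),
        }
    return coded
```

Indexing by ID is what makes the "look up by comment ID" view possible: `coded["ytc_UgzAmt4LqyU_yqviDqt4AaABAg"]` returns that comment's coded dimensions directly.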