Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Tesla follows the Vision only approach. They dont have any other Sensors to supp…" (ytc_UgzN__OAZ…)
- "@lolbitninjagostudioslncust3345 It's more like if you ordered a gigantic pre-bui…" (ytr_Ugx8yZO46…)
- "I’m just kinda sad watching all of this. I have met a few people who genuinely p…" (ytc_UgyPnGnr5…)
- "We all getting forced fed that AI bs while it is 99% just silly and useless.…" (ytr_UgwVbs3Gl…)
- "Ai is used in the military, which within its self is doom and gloom, its once ai…" (ytc_Ugxxg1eNN…)
- "Ai is good it helps hospitals, Netflix uses Ai, a lot of people use Ai because i…" (ytc_UgwXQbf-I…)
- "The dilemma is between having the complexity in the shelving system or in the ro…" (ytr_UgzoPMpOl…)
- "Abundance for who? You're assuming AI productivity gains will be shared, but tha…" (ytc_UgzeUr-9t…)
Comment
The end of scarcity. Cheapest possible manufacturing and supply chains. You won’t need 50 car companies worldwide—only 3 and 20 factories needed. 90% unemployment. All these disengaged people will need to be controlled—leaving them free with unmonitored access to AI superintelligence will be too dangerous. Control of the AI and the people using it will be necessary because of the power of these tools. Businesses will be monopolies, but charge very low prices, and people will be granted the right to certain societally accepted levels of goods and services; the elites will still be there, controlling it all and exempted from the mass “limits” imposed on the masses.
I turn to my digital assistant and ask her, "Please administer my daily-permitted dose of Soma." All is fine.
youtube · AI Jobs · 2025-11-18T21:5… · ♥ 6
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgwDkgypA37pPVmVc254AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugy5ONqZschTvr7G-IR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyhHHC4Jnrd3KgUQQN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyEEzt4mgk_dkqgFQV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxlFqZLsSDFh-kmXY14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgzaOM3u1iB2MOYnbXJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"ban","emotion":"mixed"},
{"id":"ytc_UgwntoffBryUJF2IVmx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgySFmcwnsvMV6MYIxV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwqMA_73C296IoNXrZ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgybT96Yv3NtyH0jc-F4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
```
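A raw response like the one above can be turned into the "look up by comment ID" view with a small amount of parsing. Below is a minimal sketch, assuming the model always returns a JSON array of objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields (the field names come from the response shown; the `index_codings` helper itself is hypothetical, not part of this tool):

```python
import json

# Two entries reproduced verbatim from the raw LLM response above.
raw_response = """
[
 {"id":"ytc_UgwDkgypA37pPVmVc254AaABAg","responsibility":"ai_itself",
  "reasoning":"consequentialist","policy":"none","emotion":"resignation"},
 {"id":"ytc_UgwqMA_73C296IoNXrZ4AaABAg","responsibility":"distributed",
  "reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
"""

def index_codings(response_text: str) -> dict:
    """Parse the model's JSON output and index each coding by comment ID."""
    return {row["id"]: row for row in json.loads(response_text)}

codings = index_codings(raw_response)

# Look up one comment's coding by its ID.
coding = codings["ytc_UgwqMA_73C296IoNXrZ4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

In practice you would also want to handle malformed output (the model may wrap the array in prose or emit invalid JSON), but the happy path is just `json.loads` plus a dictionary comprehension.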