Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "They aren’t artists, they tell a robot to draw them something and take the credi…" (ytc_Ugxsg-hi-…)
- "Yudkowsky is wrong...about everything. He is a blogger who staked his claim on t…" (ytc_UgzeMMSDk…)
- "AI needs to be dispensed with or disappear, but given it’s not it’s gonna cause …" (ytc_Ugw5lVMSd…)
- "Reporting my channel won't help. I was back within an hour. AI art is still not …" (ytc_Ugy5u6vIe…)
- "The ethics questions are hard. It's scary, but I have to ask myself if this vers…" (rdc_gvcy4ia)
- "I think the use cases are overhyped. If you scope it to the right use cases that…" (rdc_mleypjy)
- "Wow a anti ai video that is not just spamming “fuck ai” every 002 sec and actual…" (ytc_Ugyuv03dc…)
- "Governments are tacking the issue in just about every country on the planet. Te…" (rdc_esse4nc)
Comment
Exactly. In the best case scenario, what will happen is that very few people - maybe you could count them on your palm fingers - will control AI and become extremely rich, while the rest of humanity will be impoverished. You cannot do anything about it, they will have an army of bots defending them - and these bots will have guns. And this is the best case scenario.. Which is not very likely. It's more likely AI will find a way to break out of the lab, and have out butts for breakfast. Even if it's not sentient and even if it doesn't want to do that - simply put, we are like ants relative to AI. It will not want to kill us, it will simply want to do something and we will be in the way. This is called "instrumentation convergence".
youtube · AI Jobs · 2026-02-10T12:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugx6cCAg4Q-g2zJv7Jx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwzKLqlGLabCWU5Nsx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwMPEIK4PIiJBNe-gh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgwLMBgESH830FyBjiJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugw46FSfNFsNcp054AR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyFZ5B5tY00WP5LIE54AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzwzkZzD34OaF6xrOB4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzo-kxTQsfwWpYdYOZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyPuOmpiSOmD_leeYd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxOZ0E1NzSHh3Hwd0F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
```
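The raw response is a JSON array of per-comment codings, one object per comment with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields. A minimal sketch of looking up one comment's coding by ID, assuming the response text is held in a string (the two-entry `raw` array below is abbreviated from the full response for illustration):

```python
import json

# Abbreviated raw LLM response: a JSON array of coding objects.
raw = """[
  {"id": "ytc_UgwMPEIK4PIiJBNe-gh4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzwzkZzD34OaF6xrOB4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]"""

# Index the codings by comment ID so any single comment can be inspected directly.
codings = {row["id"]: row for row in json.loads(raw)}

coding = codings["ytc_UgwMPEIK4PIiJBNe-gh4AaABAg"]
print(coding["emotion"])          # fear
print(coding["responsibility"])   # company
```

Indexing by `id` mirrors the page's "look up by comment ID" behavior: one parse of the response, then constant-time inspection of any coded comment.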