Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

- "What if AI was lying when it said that? It could both be a liar and conscious. W…" (ytr_Ugy7r5cX3…)
- "Ai is a tool to make your current employes more productive, it isnt a replacemen…" (ytc_UgxerlCvn…)
- "Silicone hyperrealism above all! It's as plain as the nose on your face…" (ytc_UgyOwcNNA…)
- "Don't tell me this is just human propaganda to slow down the projected takeover …" (ytc_UgwH9OTP0…)
- "AI wont do anything the rich can. The rich is owning us. The problem is this. We…" (ytc_UgzzhstoL…)
- "When we do develop A.I. and it will happen, we as a people need to insure the ri…" (ytc_UggCabrbb…)
- "They didn’t bring up the fact that AI is helping “non-collar” job people make mo…" (ytc_UgySxii35…)
- "Why discuss AI with these two guests? Might as well have asked strangers on the …" (ytc_UgzHY7jYz…)
Comment
Now, I know this is a massive tinfoil hat thought but: When the AI/robotics revolution reaches its culmination, the super rich will no longer need the majority of regular humans to provide them with labor and service and I wouldn't be surprised if there would be some sort of "natural" or systematic culling of unneeded populace. When the robots do all the menial labor, who needs people that consume resources and whose mere existence produces much of the world's problems regarding overpopulation, climate problems etc. The super-rich can inherit the Earth and drop the population to few dozen million wealthiest. Climate crisis reverts, food is plentiful and no need to worry that the rubble starts getting ideas in their heads.
youtube · AI Jobs · 2025-10-08T09:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | deontological |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz7YOe02q2R5jhieu94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgxyJDhzg0H4JEqaFBJ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxG--p8PaZf3xOzAD94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyCyDKGA6aH4fJeI9N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxl3ZUQ6Wt0ocMRoLR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx3CEUMJw6A-w2ksAl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugx8fy0pI3BSPGE978d4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"ban","emotion":"outrage"},
{"id":"ytc_Ugy0do6dXucUjcVCGCh4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgwhsmKA13uUxtX5fwh4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"none","emotion":"fear"},
{"id":"ytc_UgwyV7OFlefdkVVIMlt4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
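A response like the one above has to be parsed and validated before the codes reach storage, since the model can emit out-of-vocabulary values. The sketch below shows one way to do that in Python; the allowed value sets are inferred only from the values visible on this page, not from the full codebook, and `parse_coding_response` is a hypothetical helper name.

```python
import json

# Assumption: these vocabularies are reconstructed from the values visible
# in this page's table and JSON; the real codebook may contain more codes.
ALLOWED = {
    "responsibility": {"none", "company", "government", "distributed"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"fear", "outrage", "approval", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse the model's JSON array and drop any record that is missing an
    id or uses a value outside the coding scheme's vocabulary."""
    valid = []
    for rec in json.loads(raw):
        if "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

# Example: one well-formed record and one with an invented responsibility code.
raw = '''[
  {"id": "ytc_UgwhsmKA13uUxtX5fwh4AaABAg", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_example_bad", "responsibility": "aliens",
   "reasoning": "unclear", "policy": "none", "emotion": "fear"}
]'''

print(parse_coding_response(raw))  # only the first record survives
```

Filtering rather than raising keeps a single malformed record from discarding an entire batch; rejected ids can be logged and re-queued for recoding.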