Raw LLM Responses
Inspect the exact model output for any coded comment: look it up by comment ID, or browse the random samples below.
Random samples — click to inspect

- `rdc_lgrrh3d`: "The most atrocious thing is that if big suits use AI to make games, it wont make…"
- `ytc_Ugz-F-d6P…`: "I would take the risk, if the AI tries to improve our lives it could be a chance…"
- `ytc_UgwBNhdvB…`: "Regardless if you base it on the brain it cant possible have the capabilities of…"
- `ytc_UgzOxJrtz…`: "You did it totally wrong. This is not how it works. 😂 Ask ChatGPT if Jesus Chris…"
- `ytc_Ugxz9zEPx…`: "I still feel AI is hyped. It may catchup now or in next 5 years… and then people…"
- `ytc_Ugw_yyWm6…`: "I robot! This is exactly how it's gone be in the future! People are dumb…"
- `ytc_Ugxgu2ThK…`: "The problem is the governments are making it less self-sufficient so we have to …"
- `ytc_UgzlF3fOA…`: "These guys are concerned about "human extinction" resulting from a technology pe…"
Comment

> With every possible need eventually being met by robots & AI, the top 1% that have been exploiting human labor throughout all of history may no longer have any practical use for the rest of us. We will just be in the way. Will they actually be benevolent enough to share all of these benefits with ALL of the people below them? The study of human nature says, "Mmm…probably not." But they will most likely need to have other people to control, so that they can feel superior, so they may just keep a few of us around for their personal amusement. It is extremely difficult to realistically imagine a path forward that doesn't end badly for most of us.

| Source | Topic | Posted |
|---|---|---|
| youtube | AI Jobs | 2025-12-01T02:4… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | virtue |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id": "ytc_UgxPmmG-kRt_DEQJjBJ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugxdt18ncoelSTXmPgx4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwDPg8yll6bkmCdFjd4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw3GQpj5ugO0jt0HNB4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxmUzOWeL9a3rJmiK94AaABAg", "responsibility": "none", "reasoning": "virtue", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwrW6SdDF0MqkjrTPd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgzuSHcSBeV7Q-9DoKp4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxaGxrmVAYN2Zn3h7B4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgzBazJEKzWUA-Oc62x4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgyzwxCJOzgoYrOklv14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]
```
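The lookup-by-ID view above can be reproduced directly from a raw response like this one: parse the JSON array and index the rows by their `id` field. A minimal sketch in Python (the field names `id`, `responsibility`, `reasoning`, `policy`, and `emotion` are taken from the response shown; the shortened sample payload here is illustrative):

```python
import json

# Abbreviated raw LLM response: a JSON array with one coded row per comment,
# using the same field names as the full response above.
raw_response = """
[
  {"id": "ytc_UgxPmmG-kRt_DEQJjBJ4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxaGxrmVAYN2Zn3h7B4AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "liability", "emotion": "outrage"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw coding response and index the coded rows by comment ID."""
    rows = json.loads(raw)
    return {row["id"]: row for row in rows}

codes = index_by_comment_id(raw_response)
# Look up one comment's coded dimensions by its ID.
print(codes["ytc_UgxaGxrmVAYN2Zn3h7B4AaABAg"]["emotion"])  # outrage
```

Because the IDs are unique per comment, the dict lookup makes "inspect the exact model output for any coded comment" an O(1) operation once the response is parsed.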