Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_UgzxnmnLm…: Not so fake now that Optimus “we robot” 🤖 was released 😳, wake up people this is…
- ytr_UgyCr_Jm3…: "Don't they have to prove" - Ai bros trying to defend stealing copyrighted mater…
- ytc_UgxMYCk1f…: It only took a couple months and we have an Ai that lied to a human to hire them…
- ytc_Ugx7OVz8-…: selfdriving cars is an insane idea and a clear indication of the dystopia we are…
- ytc_UgwMfxeSH…: Don't you see anything done is done for the benefit of corporations thus capital…
- ytc_Ugxbu2qY5…: You're just giving AI psychopaths a new thing to train AI on! Seriously AI bros …
- ytc_Ugwc1feHv…: I'm waiting for the day where AI teaches us the secrets to eating gas station fo…
- ytc_UgwPBNFes…: I hate to say it but Charlie is completely wrong. Copyrighting an AI piece is wo…
Comment
I am definitely concerned about AI supremacy and being wiped out in a Terminator/Skynet scenario, but my more immediate concern is the economic ramifications of AI. Basically I believe that the world will be divided into two classes that are so far apart that a single person in the top 0.1% owns as much wealth as the entire bottom 98% combined. Basically a thousand people will own the entire world between them.
We will end up with a corrupt and imbalanced system that makes the worst civilizations in human history look positively utopian.
youtube
AI Governance
2025-06-24T13:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyolKgzen8ewYmRVg14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy6alxdRnqQ1YvAk9F4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugz-2CcJGtGyGMNVgXZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwerruDXJiXyR6nTEF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugzxjqv5GYYxWjLswZN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyGjyK9dW5IcR3nRrt4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzknxbakj5ngyG4oOx4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyvqZi5XV3wEAE2CU94AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugy674Yux2-5xrsDW254AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyUyZ5dL--3vEmddFR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
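A response in this shape can be checked before it is written back to the coding table. The sketch below is a minimal validator; the allowed values per dimension are inferred only from the records shown on this page, not from the full codebook, and the function name `parse_coding_response` is illustrative.

```python
import json

# Allowed values per coding dimension, inferred from the raw responses
# above. The real codebook may define additional values (an assumption).
SCHEMA = {
    "responsibility": {"none", "distributed", "ai_itself", "developer", "company"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological"},
    "policy": {"none", "unclear", "liability", "regulate", "industry_self", "ban"},
    "emotion": {"mixed", "outrage", "fear", "approval"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse one raw LLM coding response and validate every record."""
    records = json.loads(raw)
    for rec in records:
        # Every record must carry all four dimensions.
        missing = SCHEMA.keys() - rec.keys()
        if missing:
            raise ValueError(f"{rec.get('id', '?')}: missing {sorted(missing)}")
        # Each dimension must hold a known value.
        for dim, allowed in SCHEMA.items():
            if rec[dim] not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim}={rec[dim]!r}")
    return records

raw = ('[{"id":"ytc_X","responsibility":"distributed",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]')
coded = parse_coding_response(raw)
print(coded[0]["policy"])  # prints "regulate"
```

Rejecting a whole batch on the first bad record keeps malformed model output from silently entering the coded dataset; a production pipeline might instead quarantine the offending record and re-prompt.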