Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "Wow that's revolutionary no one ever thought that, you're the smartest person ev…" (ytc_UgwhJl6r3…)
- "A 1% chance that ai will take us out becomes 100% when there are millions of var…" (ytc_Ugx0M1igz…)
- "Humans will drown in all the billions of AI videos the traffic for the channel w…" (ytc_Ugy5km2gg…)
- "The problem can come into what is considered AI and what is not? So for exampl…" (ytc_Ugxa_sPiP…)
- "While I like the optimism for poisoning images to harm scrapers, I don't think a…" (ytc_UgxLzEjhq…)
- "White man locked out of home because Amazon AI considered him racist while using…" (ytc_Ugx4QPY9E…)
- "It's weird that when Alex backs the AI into a corner, it ends its answer with, "…" (ytc_Ugw5ohfF_…)
- "This seems like a good Random Forrest problem. We use a random forrest in fraud …" (ytc_UgzK4qdxU…)
Comment
The entire Skynet machine is really starting to take over and replace all human's worth. Humans get old and die, but the machine does not die, and humans can no longer compete against the machine's vast knowledge and intelligence. Skynet is replacing most jobs, most entertainment, and most of all human thought. The only solution is the UBI, so that most of us can just sit back and collect benefits from the machine that we created.
Many of us will end up like Marge, in the Simpsons episode "You Only Move Twice", where Homer gets a new job. Others will be like a combination between the humans in I Robot, or Wall-E. Good luck humans. Good luck.
youtube · AI Jobs · 2025-10-29T15:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | liability |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
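
Each coding assigns one label per dimension. As a minimal sketch of how a result like the one above could be sanity-checked, the sets below contain only the values observed in the sample responses on this page; the actual codebook may define more labels (assumption), and the `validate` helper is illustrative, not part of the tool:

```python
# Label values observed in the sample responses on this page; the full
# codebook may include additional labels (assumption).
OBSERVED = {
    "responsibility": {"none", "company", "ai_itself", "government", "user"},
    "reasoning": {"unclear", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "unclear", "regulate", "liability"},
    "emotion": {"approval", "outrage", "fear", "indifference", "resignation"},
}

def validate(record: dict) -> list[str]:
    """Return the dimensions whose value falls outside the observed label sets."""
    return [dim for dim, allowed in OBSERVED.items()
            if record.get(dim) not in allowed]

# The coding result shown in the table above:
result = {"responsibility": "ai_itself", "reasoning": "consequentialist",
          "policy": "liability", "emotion": "fear"}
print(validate(result))  # prints [] because every value is an observed label
```

An out-of-vocabulary value (e.g. a model hallucinating a new label) would show up as a non-empty list, which makes malformed batches easy to flag before they enter the dataset.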
Raw LLM Response
```json
[
  {"id":"ytc_UgzeBFF3EWmsPB7dPnR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugws4npRpm9-DhVPmzF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugw5fwSth_ERdbnkmCB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugzn0aVUu3SghAwpaHB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwIfx53IrLRZteDXZR4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxBSWOB3AkFvt5KKu14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQgks1xd6smo_e8rh4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugxd6sKAGQBvauFVxCx4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxOoHzuwGhH1XZQb3l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwKSsavNDjU3wIyr_t4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}
]
```
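
The raw response is a JSON array of coding records keyed by comment ID, which is what makes the "look up by comment ID" view above possible. A minimal sketch of that lookup, using two records copied from the response (the variable and function names are illustrative, not the tool's actual code):

```python
import json

# Two coding records copied verbatim from the raw LLM response above.
raw_response = """
[
  {"id": "ytc_Ugzn0aVUu3SghAwpaHB4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgzeBFF3EWmsPB7dPnR4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "approval"}
]
"""

# Index the batch by comment ID for constant-time lookup.
codings = {rec["id"]: rec for rec in json.loads(raw_response)}

coding = codings["ytc_Ugzn0aVUu3SghAwpaHB4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # prints: ai_itself fear
```

Because the model returns one record per comment in a single batch, indexing once and querying by ID avoids re-parsing the response for every inspected comment.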