Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "I don't like how AI steals data but it's undoubtedly a new medium of art!" and … (ytc_UgwHJQqj0…)
- Unless an AI wipes us out with a nuclear war or a virus, nothing will change muc… (ytc_UgwVA4X_d…)
- This is why I stopped commissioning art. There is no point anymore. No matter ho… (ytc_UgyurFMA4…)
- @francismarion6400 It's so easy to dismiss the harm and strife caused by data ce… (ytr_Ugx2AtRWV…)
- Humanity needs a physical representation of God on Earth. We need to be led by s… (ytc_UgxQv6Lfg…)
- I get what you mean! The design choices for robots like Sophia can be pretty pol… (ytr_UgxCjDJ24…)
- Agreed this is truly not the way life should be going, a robot shouldn't be givi… (ytr_Ugyxn9rea…)
- Before watching the video, thank you for referring to them as AI users and not A… (ytc_Ugx1Mbbo9…)
Comment
Well fucking said. The billionaires who are openly trying to “automate all valuable labor” imagine a world where it’s just a few of them, their legions of robot laborers, and no other humans at all. They’re not going to take care of us in the WALL*E sense, they’re going to “take care of us” in the Terminator sense, so the entire world can be their playground. They’re making a huge assumption though that a super intelligent AI will see a reason to keep billionaires around, as if Peter Thiel and Elon Musk are just SO interesting that the AI is gonna want to talk to them all the time.
Source: youtube | Topic: AI Jobs | Posted: 2025-08-29T12:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
[
{"id":"ytc_UgzkbG4GQVFffR-Wq9t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugw1zU1vZv5FWxLMR0p4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgzC0tsJxvpQ5AWaZyl4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwL8uTf3Ku6bRXKu7B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"},
{"id":"ytc_UgyNrk0KsZ6Y8_ii5Nd4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugz0XiocMDFyBe5STpN4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwDgRNiqZbi71Vliyp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzCTf9Hb8gJMKVk9o94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzbWnFkUKBxRUug4jJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwYGON0z9nTVfP6Okd4AaABAg","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
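
The raw response above is a JSON array with one object per comment, each carrying the four coding dimensions shown in the table. A minimal sketch of parsing and validating such a response before storing the codes; the allowed-value sets below are assumptions inferred from the values visible in this response, and the real codebook may define more categories:

```python
import json

# Allowed values per dimension. NOTE: these sets are an assumption,
# reconstructed from the values that appear in the sample response;
# extend them to match the actual codebook.
ALLOWED = {
    "responsibility": {"none", "company", "government", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological", "contractualist"},
    "policy": {"none", "regulate", "liability"},
    "emotion": {"indifference", "outrage", "fear", "mixed", "resignation", "approval"},
}

def parse_codes(raw: str) -> dict:
    """Parse a raw LLM coding response into {comment_id: codes},
    rejecting entries with a missing id, a missing dimension, or an
    out-of-codebook value."""
    coded = {}
    for entry in json.loads(raw):
        cid = entry.get("id")
        if not cid:
            raise ValueError(f"entry missing id: {entry!r}")
        for dim, allowed in ALLOWED.items():
            value = entry.get(dim)
            if value not in allowed:
                raise ValueError(f"{cid}: bad {dim} value {value!r}")
        coded[cid] = {dim: entry[dim] for dim in ALLOWED}
    return coded

# Hypothetical single-entry response for illustration:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate",'
       '"emotion":"outrage"}]')
codes = parse_codes(raw)
print(codes["ytc_example"]["emotion"])  # outrage
```

Validating against the codebook at parse time catches the common failure mode where the model invents a label outside the taxonomy, so malformed batches fail loudly instead of contaminating the coded dataset.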