Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
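Under the hood, a lookup like this is just an index from comment ID to its coded record. A minimal sketch, assuming records shaped like the Raw LLM Response shown further down (the two records here are hypothetical examples, not real coded comments):

```python
import json

# Raw LLM responses arrive as a JSON array of coded records, one per
# comment; an ID lookup is an index from each record's "id" field.
raw = '''[
 {"id":"ytc_example1","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
 {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# Build the index once; each lookup is then a constant-time dict access.
by_id = {rec["id"]: rec for rec in json.loads(raw)}
print(by_id["ytc_example1"]["policy"])  # → liability
```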
Random samples

- `rdc_lr4oqar`: I mean that's sort of like saying that your whole problem with Hitler was him be…
- `ytc_UgxPHkOli…`: It’s not AI. You spent your college years being an activist and a professional a…
- `ytc_UgziOuzox…`: Will we be just a stepping stone to transition to a new form of life. Humans had…
- `ytc_UgyWEp7-u…`: Any computer program cannot be “forced” to answer what you want it to. The guy w…
- `ytc_UgxHOkL_-…`: "There's no smudge or blur tool in real life." Yes there is??? Many tools, actua…
- `ytr_Ugw4b02pb…`: @mintkd2294 Thank you for commenting! Robot pain is definitely not fair, but may…
- `ytr_UgwPQl4Xo…`: @TheSuperRatt I don't personally believe that LLMs are capable of becoming intel…
- `ytc_UgyWtctIf…`: I think the people who make AI should make AI to replace themselves in society. …
Comment
Let's think back to JRE with musk when he first showed up and was talking about the danger of AI, and how we're not going to have a choice but to institute a basic standard income for everyone. After those dots were connected, Elon stopped trying to raise the flags on AI and went all in on it, bought twitter and created grok.
The take-away is he could see there was NO stopping AI, and he could either get in on it or loose out on it. The tech billionaires are going to replace people with AI, and eventually a basic standard income will be required to sustain. There will be those on the top, and those on the bottom. The future sucks.
youtube · AI Jobs · 2025-10-29T14:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugx1pPv4yafH6CofF7d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxnnUWtroq157wcyt94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxWzOPMnID1_zbO63V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugx526klvjgxax7YPVB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
{"id":"ytc_UgyPPbZn6Uat0kEL6v14AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxIzwrchxHdWJOVA0F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"mixed"},
{"id":"ytc_UgyzT1U-woWvndx7JIV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzCNR4VcjxRqoQWZB54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxK4OinINO4wB1omZd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugy1F8vHGFcr1xET4-J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
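A raw batch like the one above can be parsed and sanity-checked before its records are displayed. The sketch below assumes the four dimensions from the Coding Result table; the allowed labels are inferred only from the values visible in this dump, so the real codebook may define more categories (the two records in `raw` are hypothetical examples):

```python
import json

# Allowed labels per coding dimension, inferred from the values seen in
# this dump; the actual codebook may be larger.
ALLOWED = {
    "responsibility": {"none", "company", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"unclear", "none", "regulate", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "resignation", "mixed", "approval"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and check every record against the codebook."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing id: {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim} value {value!r}")
    return records

# Example: a two-record batch in the same shape as the response above.
raw = '''[
 {"id":"ytc_example1","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
 {"id":"ytc_example2","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]'''
batch = validate_batch(raw)
print(len(batch))  # → 2
```

Failing loudly on an unknown label catches the common failure mode where the model invents a category outside the codebook, rather than silently storing it.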