Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
How could AI respect a humanity that has allowed half its global population to mature to adulthood with a mean IQ of 90? That’s not a failure of individuals, even though leaded gasoline had shaved off billions of IQ points globally, and the mass of our brain has decreased by 10%, over the last 10,000 years. It’s an abject failure of humanity in an emotional state, ingrained inside us, aggravated by 80 generations of relentless meanness, cruelty, murder, rape and injustice. We can do better than this. Indeed we need to if we’re to survive ourselves and AI.
Source: youtube · AI Jobs · 2025-11-05T18:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | virtue |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxNVymkqjlG0avlpSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzKNbL7g6WhQnAwKgR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugz-qWV2rxj3hAtq80l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugy8brb1jtOSC7naXuR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugz7IfkNX1E2-wWmSIV4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"resignation"},
  {"id":"ytc_UgzDXJxsu7yyVsxx0u94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugz_l8j5SrwkRiKFQDR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy1sBK_Litwi_vmXj14AaABAg","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwc6BrxStYm_5BSl6J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzptuLf9gHUbdRRqYd4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
```
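The coded dimensions shown in the table above are read straight out of this JSON batch. A minimal sketch of how that lookup-by-ID might work, assuming each row carries exactly the five keys seen in the output (the function name and the skip-malformed-rows policy are illustrative assumptions, not the app's actual implementation):

```python
import json

# Raw model output as shown above, truncated to two rows for brevity.
raw_response = """
[
 {"id":"ytc_UgxNVymkqjlG0avlpSl4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_Ugz_l8j5SrwkRiKFQDR4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"}
]
"""

# The five fields every coded row is expected to contain.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def index_by_id(payload: str) -> dict:
    """Parse a batch of coded rows and index them by comment ID.

    Rows missing any expected key are skipped rather than raising,
    since LLM output is not guaranteed to be well-formed JSON rows.
    """
    rows = json.loads(payload)
    return {row["id"]: row for row in rows if EXPECTED_KEYS <= row.keys()}

coded = index_by_id(raw_response)
print(coded["ytc_Ugz_l8j5SrwkRiKFQDR4AaABAg"]["reasoning"])  # virtue
```

Indexing by the comment ID is what lets the page join a coded row back to the original comment text and metadata shown above it.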