Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples

| Comment (truncated) | Comment ID |
|---|---|
| Art is suppose to be handmade , with love and care, hard work and effort, heart … | ytc_UgxOCjRsk… |
| a whole bunch, which ones specifically? We use Chat GPT, Grok, Deepseek, Gemini… | ytr_UgzkBi2x4… |
| Tucker has Elon and and talks about the dangers to mankind courtesy of AI Joy R… | ytc_UgyUb2piI… |
| right around 10 minutes in im expecting chatgpt to say "STFU man! OMG you are SO… | ytc_UgyI2H05z… |
| She has some points that there are different kinds of issues in many contexts, b… | ytc_UgxBIHG_g… |
| So like, do these AI bros not realize that most digital artist started on paper … | ytc_UgzaTPWxI… |
| I hate this argument because how fast do you think AI is advancing, how much lon… | ytr_Ugzh3ohDQ… |
| Except this conversation will not have a significant weight in any AI database. … | ytr_Ugwk4ukLJ… |
Comment
Let's assume that what some claim is true—that most of the jobs currently performed by humans can be done more efficiently by AI.
If this is what happens in the near future, it means there will be millions, or perhaps billions, of people without a job. If most people no longer have an income, who will buy the goods that companies so willingly sell us?
Will people without an income passively accept starvation? And if no one can buy anything anymore, what's the point of producing anything at all?
In short, either people will be paid to do nothing, or trade will simply cease to exist.
But isn't that what our capitalist society is based on?
Frankly, I find this all rather funny.
Why? Because it seems to me that human society is run by complete idiots who don't know what they're doing.
youtube · AI Governance · 2025-12-09T14:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgztS0q8_1H6nvUNjLd4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxGTEpfVBbzTPfVp_Z4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwQrQ96usVKwd9P00p4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_Ugzi0cA4lSkk8w_dZox4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzTsObAhGsVY2Dk_IJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxpIrXnQ63fSjI-RtJ4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgztULkQzdrjx6GpvZ54AaABAg","responsibility":"none","reasoning":"mixed","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgyPTmGp6f8gH9Md3lZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugwu44HtihKP4y8xWRN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
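A raw response like the one above can be parsed and then queried by comment ID, which is what the lookup on this page does. The sketch below is a minimal, hedged example, not the tool's actual implementation: the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken from the JSON shown, while the function names and the two inlined records are illustrative assumptions.

```python
import json

# A shortened raw LLM response (two records copied from the output above;
# the real response is a JSON array with one object per coded comment).
raw_response = """
[
  {"id": "ytc_Ugz2aG7N3OMQ3Rntjwh4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwQrQ96usVKwd9P00p4AaABAg", "responsibility": "distributed",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "mixed"}
]
"""

# The coding dimensions every record is expected to carry, per the JSON above.
EXPECTED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse a raw response; keep only records with all expected fields,
    indexed by comment ID for fast lookup."""
    records = json.loads(text)
    return {r["id"]: r for r in records if EXPECTED_KEYS <= r.keys()}

def lookup(codings, comment_id):
    """Look up a coded comment by its ID (None if absent)."""
    return codings.get(comment_id)

codings = parse_codings(raw_response)
rec = lookup(codings, "ytc_UgwQrQ96usVKwd9P00p4AaABAg")
print(rec["responsibility"])  # → distributed
```

Indexing by `id` mirrors the "look up by comment ID" feature: malformed records are dropped at parse time rather than surfacing later as missing-key errors.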