Raw LLM Responses
Inspect the exact model output for any coded comment: look up a comment by its ID, or browse the random samples below.
Random samples:

- "Wrong, they are detecting patterns that they think are AI, and have false flagge…" (ytc_UgwEemmXv…)
- "Human is not that kind of importance if AI can do anything. AI is owned by those…" (ytc_Ugzex3c6N…)
- "I no longer know if this is reality or if it is made by AI…" (ytc_Ugy07hyGB…)
- "You can tell by if they're like oddly racially ambiguous. like the first picture…" (ytc_UgynG4chR…)
- "I really don’t like what AI represents. I will admit it has potential and genuin…" (ytc_UgzOXdLUo…)
- "If people talked like that during nazism and fascism, more people would have die…" (ytc_UgxuGJItV…)
- "@Speaker-Beateryeah yeah btw traditional artists used to say the same about earl…" (ytr_UgxnEDO8H…)
- "When will we see company layoff all janitors and say AI took over the jobs? I me…" (ytc_UgwnruPBp…)
Comment (quoted verbatim, as submitted):

> I work with AI and there are very good warnings here absolutely.
> However it lacks logic
> Without jobs, there are no money to these people. They cannot consume or buy stuff that fuels the rich or the governments as AI only require electricity and parts.
> So if the rich has AI to produce and do everything but no consumers they go bankrupt and that was the entire economic collapse for them.
> Governments cannot get tax money from people without money so they get economic collapse.
> Also instead of looking purely to the future as we tend to do, we need to compare outcomes with historical similarities. Like the collapse of the Ming dynasty, or French revolution etc. what happened when the rich became so powerful they evaded all taxes and put all of it on the middle class and poor? Or what happens when people start to starve?
> This are food for thoughts
> The real historical lesson is this:
> When people lose the ability to survive through labor, societies either transform or collapse.
> AI will force a transformation, which one depends on policy and power, not technology.
Source: youtube · AI Governance · 2025-11-28T09:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response

```json
[{"id":"ytc_UgymyYlEKzfd80auW914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz2G8nKwQCVoehktuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxZfZS9EbQORa-5RVB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugxn6ufb43Wo0eCPHlJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgyZ2q2BLlgEYLEhE8d4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_Ugyer1gSElGazA5MGK54AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgxJ_nZu5MopVwq4BjB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwdfiGYV6-HVRW_K7N4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxWhLUA-80CfBRZTIB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzHVDghiedvkYGvAkp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}]
```
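Since the raw LLM response is a JSON array of per-comment code records, the "look up by comment ID" view can be backed by a simple index keyed on `id`. A minimal sketch, assuming the response is available as a string (variable names here are illustrative, not part of the dashboard's actual code):

```python
import json

# Two records copied from the raw LLM response above; in practice this
# string would be the full model output for a batch of comments.
raw_response = """[
  {"id": "ytc_UgymyYlEKzfd80auW914AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugyer1gSElGazA5MGK54AaABAg", "responsibility": "developer",
   "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"}
]"""

# Parse the array and index the records by comment ID for O(1) lookup.
codes = {record["id"]: record for record in json.loads(raw_response)}

# Look up the exact codes the model assigned to one comment.
record = codes["ytc_Ugyer1gSElGazA5MGK54AaABAg"]
print(record["policy"])   # -> regulate
print(record["emotion"])  # -> outrage
```

Indexing once up front, rather than scanning the array per lookup, keeps ID lookups constant-time even when a batch contains thousands of coded comments.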