Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "Its a hellova driverless 80.000 lb Dirty Bomb delivery system ... Domestic terro…" (ytc_Ugx2GRydO…)
- "No whats scary about all this is the AI not the fake robots its the AI that fool…" (ytc_Ugy7fr8EE…)
- ""artist" is a broad term since anything can be considered an art. It can be diff…" (ytc_Ugxj2hJ4i…)
- "When Socrates criticized the Athenian poets, he said in his judgment that most p…" (ytc_Ugxg5zJYw…)
- "Here’s the problem with AI lawyers, ignoring the legality. The size, power, and …" (ytc_UgzbGgvrA…)
- "Okay look I am not an AI guy and most AI art looks rough right now, but I can't …" (ytc_Ugzm0VPMk…)
- "What nonsense. AI isn't real. ChatGPT is just a giant reddit based bot that sp…" (ytc_Ugytlm_7q…)
- "Why cant they use AI to do the dehumanizing jobs like cracking cashews that burn…" (ytc_UgwYsTrdC…)
Comment
In my opinion they are all overestimating what an AI is nowadays.
Currently it's just a software with no curiosity, no willingness to evolve, discover, sense. It requires human input to operate and do things. It's energy-intensive.
AGI is not so easy to build. It's expensive, requires lots of energy and, the most important thing, people are greedy. No one will build it and allow it to evolve freely with no control, because it will not make any money.
And one more thing. If no one will work and no one will earn money, how will those few guys make money? Who will buy their products? How they are going to control people to not allow protesting against them, attacking their robot making factories to stop taking their jobs and freedom? Why the banks, financial institutions and other powers will allow them to destroy their empires?
Source: youtube · Topic: AI Governance · Posted: 2025-09-05T23:3… · ♥ 5
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | fear |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_Ugzsz5Gzt-u-AFQr0hR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugz7WtUTS4HPiHVvKlN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyVT9CGzhMWJCFoP-h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxLz1gLjKnRnwe_-RB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgxdO8SIZx2nG13rDIl4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxh8njCqmuK36S3GVl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugyh5jk1WcD8VInut-l4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzcKOs7uUDFcYa-a6t4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwcAAlqYt-lP53FS7N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugx3MVPRi_gnqvRlsIN4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"outrage"}]
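The raw response above is a JSON array with one coding object per comment, each carrying the four dimensions from the table (responsibility, reasoning, policy, emotion). A minimal sketch of the "look up by comment ID" step — the helper name `index_by_comment_id` is illustrative, and the excerpt reuses two records from the response above:

```python
import json

# Excerpt of a raw LLM response: one coding object per comment,
# with the same dimension fields shown in the Coding Result table.
raw_response = '''[
 {"id": "ytc_UgyVT9CGzhMWJCFoP-h4AaABAg",
  "responsibility": "distributed", "reasoning": "consequentialist",
  "policy": "none", "emotion": "fear"},
 {"id": "ytc_Ugz7WtUTS4HPiHVvKlN4AaABAg",
  "responsibility": "none", "reasoning": "unclear",
  "policy": "none", "emotion": "resignation"}
]'''

def index_by_comment_id(response_text):
    """Parse a raw LLM response and key each coding record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

codes = index_by_comment_id(raw_response)
print(codes["ytc_UgyVT9CGzhMWJCFoP-h4AaABAg"]["emotion"])  # fear
```

Keying the parsed array by `id` makes the "inspect the exact model output for any coded comment" lookup a constant-time dictionary access rather than a scan over the array.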