Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples

- "The office of 20 or 30 years from now will be as different as the 1970s or 1980s…" — ytc_UgwNDBMSo…
- "These guys creating this Ai bs don't even know they're making the thing that wil…" — ytc_UgzG4-mZT…
- "It's kind of like we built the ultimate test for ourselves. Maybe this is why th…" — ytc_Ugzw9fBF-…
- "boilerplate is still being pushed. When it's time to go to prod it gets quiet. B…" — ytc_UgzaNZOLm…
- "The principled reason that AI can never be conscious is the question. Right now…" — ytc_Ugy8EVyq9…
- "Bro while my art gets 10 views and 2 likes this guy calls himself an artist and …" — ytc_UgytzVszs…
- "The hubris of those worrying about AI that patterns itself after human behavior.…" — ytc_Ugwa597GR…
- "As someone who is still learning about programming, this really made me question…" — ytc_Ugy-AuNcf…
Comment
> Ai creators are too biased, too left-brained, and too in love with their own creations to offer credible discussion on this topic. AI is an automation tool and nothing more. Amazing, yes, but not a living, breathing human being by any stretch of the imagination. A computer brain is not a human brain, no matter how "smart" it may seem. The same scares have been perpetuated since the invention of the wheel. It's like talking to the engineer who invented digital photography about the future of cinema, as a whole. Technology is just one aspect, one dimension, and that's all the creators see.

youtube · AI Governance · 2025-08-30T21:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
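A coded record like the one above can be sanity-checked before it is stored. This is a minimal sketch; the allowed value sets below are assumptions drawn only from the responses visible on this page, and the real codebook may define more values.

```python
# Validate one coded record against the dimension values observed in this
# batch. ALLOWED is an assumption inferred from the raw responses shown
# here, not the authoritative codebook.
ALLOWED = {
    "responsibility": {"developer", "company", "ai_itself", "none", "unclear"},
    "reasoning": {"virtue", "consequentialist", "deontological", "unclear"},
    "policy": {"regulate", "liability", "none", "unclear"},
    "emotion": {"outrage", "mixed", "approval", "indifference", "fear"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems; an empty list means the record passed."""
    problems = []
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value is None:
            problems.append(f"missing dimension: {dim}")
        elif value not in allowed:
            problems.append(f"unknown value {value!r} for {dim}")
    return problems

# The coding result from the table above:
coded = {"responsibility": "developer", "reasoning": "virtue",
         "policy": "none", "emotion": "mixed"}
print(validate(coded))  # → []
```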
Raw LLM Response
```json
[
  {"id":"ytc_UgxAj574KekXhU9I6Hp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgztRC18FANcZbR0Be54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz0t7tuBbyFuqPgp614AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgxeZXqBIYUIqhlFQ7h4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgweOBO5lJ0kPQZ2Wx94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxFJhD6xeeXLFeWBil4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgzPjAbQravsxOgsKOx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzS2vnHvklgpb7-6p94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgzPuoWt5S5I0KtKD6V4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwkvaIPrf5Eey9wwgd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}
]
```
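Since the model returns one JSON array per batch, looking up a single coded comment by its ID amounts to parsing the array and indexing it. A minimal sketch, using two records from the response above:

```python
import json

# Two records copied from the raw LLM response shown above; a real
# response contains one object per comment in the batch.
raw_response = '''[
  {"id":"ytc_UgzPjAbQravsxOgsKOx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxFJhD6xeeXLFeWBil4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Index the parsed array by comment ID for O(1) lookup.
by_id = {row["id"]: row for row in json.loads(raw_response)}

record = by_id["ytc_UgzPjAbQravsxOgsKOx4AaABAg"]
print(record["emotion"])  # → mixed
```

This matches the coding result table for the selected comment, whose full ID appears in the seventh entry of the response.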