Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect
- "I really hate when pro ai art tell us that we are trying to gate keep art creati…" (ytc_UgwJK5A6K…)
- "i am not physically disabled, but even putting aside how disabled people can mak…" (ytc_UgxBBAHdy…)
- "AI generated images reduces art to the final money worth product which goes agai…" (ytc_UgxFfTC9p…)
- "there are no concerns, AI must just take westerner jobs so we can crush them eas…" (ytc_UgwrBvA5m…)
- "The image of robots chasing us around and killing us is not the way AI will dest…" (ytc_UgyMvnqaw…)
- "Pure hype. The current AI business model is a house of cards, and this 'circular…" (ytc_UgylOPrAj…)
- "Creating vastly superior minds who only know imprisonment, domination by fearful…" (ytc_UgyzqRFW-…)
- "AI is much more subtle than nukes. I truly feel technically is the beast. We can…" (ytc_UgwEj5a0A…)
Comment
Sam is demonstrably not a 'good' person. Wealth is his driver, at best. Same with the rest of them (billionaires), dominance is not the objective, power over all is, the power to do as they please without personal consequence. Their way, or the highway. There is no way they can justify their view, that they know whats best. THE ONLY way to stop them is"people" need to stop working for them until they allow input from the rest of us, or we all die. Shouldn't really be a question of should we get input, the answer is they don't have the right to risk us all. And after that rant I'm with Roman, fuck dying at all, ever! And all the bs jobs can go to A.I. and I'll go sailing the world until I can 'sail' the universe!
youtube · AI Governance · 2025-09-04T10:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | virtue |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwo8mdO5NmUewBKJe54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzAxs4ex4QhQ93IUux4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwwMZz0ZEysZgXab1x4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwXiOKovFRlfqDD9u94AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
{"id":"ytc_Ugx04xxwxVSCw575cDV4AaABAg","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwLS2mD7s515J36TGd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwHNlf6qQ-a-Yx0veF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgxGldJnPVjHJN5kqdp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxCynoDO1rkGorGogN4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyG6SB8c3gLqxvH85V4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
```
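A response like the one above has to be parsed and sanity-checked before the codes land in the table shown under "Coding Result". Below is a minimal sketch of such a validation step. The allowed vocabularies are inferred from the values visible in this sample output; they are assumptions, not the tool's definitive codebook, and `validate_records` is a hypothetical helper, not the project's actual pipeline code.

```python
import json

# Assumed label vocabularies, inferred from the sample response above.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "distributed",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "contractualist", "mixed", "unclear"},
    "policy": {"regulate", "none", "unclear"},
    "emotion": {"outrage", "fear", "approval", "resignation",
                "indifference", "mixed"},
}

def validate_records(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Comment IDs in this dataset start with the "ytc_" prefix.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Every dimension must carry a recognized label.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

sample = ('[{"id":"ytc_Ugwo8mdO5NmUewBKJe54AaABAg","responsibility":"developer",'
          '"reasoning":"virtue","policy":"none","emotion":"outrage"}]')
print(len(validate_records(sample)))  # → 1
```

A record that fails either check (a malformed ID or an off-vocabulary label) is silently dropped here; a production pipeline would more likely log it and queue the comment for re-coding.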