Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- "It's amazing how skill you are with the card trick. But good job not trying to …" (ytc_UgwHALF74…)
- "Self-driving cars i think do have the knowledge to not be driving near huge carg…" (ytc_Ugw6WKiqU…)
- "They need to go to China to see how self driving car which has amazing sensors a…" (ytc_Ugzlz24e8…)
- "Genuine question but what about the redraw trends where people collectively deci…" (ytr_UgxJPanHZ…)
- "@lavenderstarzzz That's what I like to think: AI Art is only and exclusively pop…" (ytr_UgyssUvdi…)
- "I'm so glad there's pushback against AI-generated slop and people aren't just pa…" (ytc_UgzvDFpfq…)
- "We are simply cattle, what happens to cattle when it is no longer needed? The …" (ytc_Ugy-4CngP…)
- "Hitler destroyed 6 million Jewish persons, this evil has the potential to destro…" (ytc_UgyjHkmKo…)
Comment
spoken like a true hypercapitalist.
AGI = human replaceable
Goes back to Turing...instead of having a conversation with a machine indistinguishable from humans...having a task performed indistinguishable from the best expert
ultimately this is driven by profit...but what happens when humans no longer benefit? Will they pay? Will there be any profit? To close the loop and still allow Wall St to perpetuate itself, there has to be a new currency that AI models can accumulate and spend; a new currency tailored for AI. And this currency will have a very complicated value, just like the tokens that a model ingests, unreadable by humans, unusable by humans. But AI will provide tools for humans to use to generate this currency. AI of the future will use humans just as humans used computers to mine bitcoin.
youtube
AI Governance
2025-07-28T14:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | company |
| Reasoning | contractualist |
| Policy | regulate |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
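Each coded record assigns one label per dimension from a closed vocabulary. A minimal validation sketch, assuming the label sets below (inferred only from the codes visible on this page, not an exhaustive codebook):

```python
# Allowed labels per dimension, inferred from codes observed on this page
# (assumption: the real codebook may include additional values).
ALLOWED = {
    "responsibility": {"company", "government", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist",
                  "virtue", "mixed", "unclear"},
    "policy": {"regulate", "none"},
    "emotion": {"fear", "approval", "disapproval", "resignation",
                "outrage", "mixed"},
}

def validate(record: dict) -> list[str]:
    """Return a list of problems with one coded record (empty if valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for dim, allowed in ALLOWED.items():
        value = record.get(dim)
        if value not in allowed:
            problems.append(f"{dim}={value!r} not in codebook")
    return problems

# The record shown in the Coding Result table above passes validation.
print(validate({"id": "ytc_x", "responsibility": "company",
                "reasoning": "contractualist", "policy": "regulate",
                "emotion": "outrage"}))  # → []
```

A check like this is useful before accepting a raw model response, since an LLM can emit labels outside the codebook.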
Raw LLM Response
[
{"id":"ytc_Ugy90fRmaOvnRDiN1uZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxL3ftAVz_OhHbBIhB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_Ugxe03nhf68dUe2_3Bp4AaABAg","responsibility":"government","reasoning":"virtue","policy":"none","emotion":"disapproval"},
{"id":"ytc_UgxMdWzUR6vbqjKpeoR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgyyEWyJup8tN6y5OlV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
{"id":"ytc_UgxrwmWp7D4_lJ8LlWN4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyfBVzHVfzZQzhZOUB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzDg7oJqK3NwMQwzV94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgxnWFidCtImgIjzj1N4AaABAg","responsibility":"company","reasoning":"contractualist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugznj90sfuX_fXWIbIJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
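The raw response above is a JSON array of per-comment codes, so "look up by comment ID" reduces to parsing the array and indexing it by `id`. A minimal sketch (the `raw` string is abridged to two records copied from the output above; variable names are illustrative):

```python
import json

# Abridged raw LLM response: a JSON array of coded comments
# (two records copied from the output above; the real response has ten).
raw = """[
  {"id": "ytc_UgxnWFidCtImgIjzj1N4AaABAg", "responsibility": "company",
   "reasoning": "contractualist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy90fRmaOvnRDiN1uZ4AaABAg", "responsibility": "ai_itself",
   "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]"""

# Index the codes by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw)}

rec = codes_by_id["ytc_UgxnWFidCtImgIjzj1N4AaABAg"]
print(rec["policy"], rec["emotion"])  # → regulate outrage
```

The first record matches the company / contractualist / regulate / outrage row shown in the Coding Result table.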