Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
Fake af lol. Why would you want that opposed to just automated those functions a…
ytc_UgzmLaN97…
Oof... even the data analysts are going to get it... o1 and o3 are going to replace …
ytc_UgysEHnO0…
it’s HILARIOUS that the dude advocating for AI “art” said: “you can’t expect muc…
ytc_UgxEeH1ef…
or or, you could always poison the metadata in most image extensions (.png, .jpg…
ytc_Ugz_jPvlD…
Unless we can envision ourselves as a singular humanity on this planet, the end …
ytc_UgxtX0Wzh…
AI is the world's most expensive parrot. It can only spit out parts of what othe…
ytc_Ugx8QQ3i9…
Nice video! Well laid dout facts there too. I'm thinking of venturing into imagi…
ytc_UgwRaqRVV…
I am so saddened after listening to this podcast and I’m not even done yet. The …
ytc_Ugyy6idJ5…
Comment
If AI does all work why do humans need to learn? That's a prescription for widespread ignorance about everything. The only way future humans will have knowledge is if it is implanted with the human body. Instead of going to school humans would be implanted with devices that interact with the brain and allow us to access knowledge on demand. And that will be a new species of human. Humanity become super intelligent cyborgs. And of course it would come with it's own risks. But without a motive to struggle through the process of schooling there will be no other way to make humans knowledgeable. Otherwise we become as stupid as the humans found in HG Welles "The Time Machine" or Aldous Huxley’s "Brave New World".
youtube
2026-03-28T21:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-26T23:09:12.988011 |
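The four coding dimensions above can be sanity-checked against the label sets that actually appear in this batch. A minimal sketch; the allowed-value sets below are only the values observed in the raw LLM response on this page, not necessarily the complete codebook:

```python
# Label sets observed in the raw LLM response on this page (an illustrative
# subset, not necessarily the full codebook).
OBSERVED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "developer", "user"},
    "reasoning": {"mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"mixed", "fear", "approval", "resignation", "indifference", "outrage"},
}

def check_row(row: dict) -> list[str]:
    """Return the dimension names whose value falls outside the observed set."""
    return [dim for dim, allowed in OBSERVED.items() if row.get(dim) not in allowed]

# The coding result shown in the table above.
row = {"responsibility": "none", "reasoning": "mixed",
       "policy": "none", "emotion": "resignation"}
print(check_row(row))  # -> []
```

An empty list means every dimension carries a value already seen elsewhere in the batch; a non-empty list flags rows worth re-inspecting.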
Raw LLM Response
[
{"id":"ytc_UgytfYErCKg_D6tylVl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyAtPag_Vw-PscL-Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgyM9KQlvTZejeRx1Hx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgxRCQX5zacZXUSY7st4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytc_Ugw6I8YYKyEAtpJymep4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzflyDTt2M6ws6qSFl4AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxLaNx08Q8xADlxOxJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgzSGq6HOeMmb0C3MB94AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyGYh6IFWP7xUQ9Xm14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"fear"},
{"id":"ytc_Ugysu84epWp-utI89zd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
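The raw response is a JSON array of per-comment codes, so the "look up by comment ID" flow reduces to parsing the array and indexing it by `id`. A minimal sketch, assuming the model output parses as valid JSON; the `raw` string reproduces two entries from the response above:

```python
import json

# Two entries reproduced verbatim from the raw LLM response above.
raw = """[
  {"id":"ytc_UgytfYErCKg_D6tylVl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyAtPag_Vw-PscL-Cp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]"""

# Index the coded rows by comment ID for constant-time lookup.
codes = {row["id"]: row for row in json.loads(raw)}

entry = codes["ytc_UgyAtPag_Vw-PscL-Cp4AaABAg"]
print(entry["policy"])   # -> ban
print(entry["emotion"])  # -> fear
```

In practice the parse should be wrapped in a `try`/`except json.JSONDecodeError`, since a model response is not guaranteed to be well-formed JSON.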