Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_UgzNufcl9…`: "I don't fear AIs; they are too decentralized and abstract. It's the human owners…"
- `ytc_UgylnGsQh…`: "AI can copy itself into DNA and regenerate itself once it finds the right enviro…"
- `ytc_UgxbUzLy8…`: "I prefer ai because it's convenient and has no malice unlike a lot of artists. I…"
- `ytc_UgyVyWjOT…`: "these ai prompters talking about an artist "stealing a style" reminds me of that…"
- `ytc_Ugx8k48tR…`: "When the AI paints you an angel that does not mean God is in the code. I thought…"
- `ytc_UgzlAw34K…`: "Yeah, as someone who has fed some of my own real artwork to AI to "clean up" and…"
- `ytc_Ugxk0RUBK…`: "Good luck with AI doing massage therapy. Super emotional intelligence and the ab…"
- `ytc_UgygTmgG9…`: "I will put a book, pencil, and paper education up against any AI school anytime.…"
Comment
We have no control over what AI will become. To think we can guide, direct, influence or control AI's development is utter foolishness. If all human development of AI ceased today there is enough AI out there that it will develop itself into a superior entity. We cannot stop it even if we all agreed to stop it. On the contrary many entities are pouring billions into it.
youtube · AI Governance · 2025-06-18T21:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | resignation |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxKF7C8oSgs1xdfQBV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytc_UgzRkgT0zdgx2g5BlA14AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwVhZcj6oyfBcGiekl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxeRjmkjJJUfMCUaiF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgzGTYs8M891eYsCOVB4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgyQzCWXV5yWLREt5L94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzrVd7bQQ4rfEOtTxx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzIxWP0Dic-lyoeXvp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzqQmRh3winnptBSKx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgyYTHcsJDru-fO6qzR4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
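The lookup-by-comment-ID step can be sketched in Python. This is a minimal sketch assuming each raw LLM response is, as above, a JSON array of records carrying an `id` field plus the four coding dimensions; the helper name `index_by_comment_id` and the trimmed sample data are illustrative, not part of the tool.

```python
import json

# Trimmed sample in the same shape as the raw batch response shown above.
raw_response = """
[
  {"id": "ytc_UgzIxWP0Dic-lyoeXvp4AaABAg",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzrVd7bQQ4rfEOtTxx4AaABAg",
   "responsibility": "distributed", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"}
]
"""

def index_by_comment_id(raw: str) -> dict:
    """Parse a raw batch response and key each coding record by its comment ID."""
    records = json.loads(raw)
    return {rec["id"]: rec for rec in records}

codings = index_by_comment_id(raw_response)
coding = codings["ytc_UgzIxWP0Dic-lyoeXvp4AaABAg"]
print(coding["emotion"])  # -> resignation, matching the Coding Result table
```

Keying the parsed array by `id` up front makes each subsequent lookup O(1), which matters when inspecting many coded comments against large batch responses.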