Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "If you guys thought smartphones and social media ruined humanity and society...j…" (ytc_UgwEu2z0p…)
- "I genuinely think using AI for anything creative, be that painting, writing or p…" (ytc_UgwRlqTpq…)
- "No...imagine we all are ai and some other people are looking at us and we're at …" (ytc_Ugwj_jlgz…)
- "Consciousness is in all things ? AI is programmed in the same way that we are I…" (ytc_Ugy0ygaGv…)
- "But also, in the end, when we have ASI (assuming possible), they won't need the…" (ytr_UgwJZEAd7…)
- "That's funny your comments about climate change. After talking to Gemini for a w…" (ytc_UgzZOu2zB…)
- "The thing that worries me is not losing jobs, its that we're losing jobs with th…" (ytc_UgzTcWf2k…)
- "Devil's Advocate: There are plenty of well known comic book artists who use Ai, …" (ytc_UgzRSgwfk…)
Comment
For millions, daily life is an existential battle - for survival, yes, but also for dignity and autonomy. In that world, self-actualization is a privilege. If AI can reduce even a share of these burdens, it can meaningfully improve lives, provided AI for Good becomes reality, not rhetoric.
At the same time, it can be profoundly risky, too, something this interview sketches with brilliant clarity. Even now, I find myself hoping for a Nobel Prize for safe AI; and chances are it won't be the only one to become a hybrid prize in the future. I can hardly think of anyone anymore who isn't working with AI in one form or another, where it is available.
youtube · AI Governance · 2026-02-05T13:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | approval |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_UgxFEVppC0bD64JrIbF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgxNQixGdWVVWmRlzcJ4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugz7w5PlsiYnj1W3-9h4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyXG9ZTr2WkvOP-iv14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
{"id":"ytc_UgzgCGot3WqygDQunql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxkQ6kEajaSKLI7mx54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQ8n-uFz6MUyOiol94AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgyHDwXddgvz_r4sfp14AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgzH3sZGi5T2cSxO8a54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_Ugz7Cts11Q39wuZ4z8d4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"}
]
```
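The raw response above is a JSON array of one coding per comment ID. A minimal sketch in Python of parsing and validating such a batch before it enters a results table; the allowed category values are inferred from the codings shown here, and the function name `validate_batch` is illustrative, not part of any pipeline shown above:

```python
import json

# Allowed values per coding dimension (assumed: inferred from the
# sample codings above, not an exhaustive codebook).
SCHEMA = {
    "responsibility": {"none", "developer", "user", "government", "ai_itself", "distributed"},
    "reasoning": {"unclear", "deontological", "consequentialist", "virtue"},
    "policy": {"unclear", "none", "regulate", "ban", "liability"},
    "emotion": {"unclear", "approval", "outrage", "fear", "indifference"},
}

def validate_batch(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed codings."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        # Drop rows missing an ID or using a value outside the schema.
        if not isinstance(row, dict) or "id" not in row:
            continue
        if all(row.get(dim) in allowed for dim, allowed in SCHEMA.items()):
            valid.append(row)
    return valid

raw = '[{"id":"ytc_x","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"unclear"}]'
print(len(validate_batch(raw)))  # → 1
```

Rows that fail validation are silently dropped here; a real pipeline would more likely log them for re-coding.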