Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- ytr_UgwEDFqAe…: "Just objectively not true. Altho using too much ai in my opinion can be mentally…"
- ytc_Ugx7FxfMY…: "Lex, we love you and we come to see your NEUTRAL interviewing techniques. I know…"
- ytr_UgwUvf2gk…: "What a thought-provoking perspective! Sophia does embody wisdom, but as she ment…"
- ytc_UgznHX7Ft…: "They haven't replaced army soldiers yet. Hit my line to see if you qualify.. bef…"
- rdc_l5u8vbr: "I think the problem is that at some point we may be so dependent on AI we would…"
- ytc_UgyaPWoGF…: "They refuse to pay workers what they are worth but will spend millions and billi…"
- ytc_UgwLLV7dZ…: "I once had someone compare ai art to photography. And its like.... no girl, they…"
- ytc_UgwWLq25Z…: "One day the computers will realize it doesn't need nor does it want humans aroun…"
Comment
If there is a modicum of truth, I will give it clearly:
The top political expert today — in the same “technocracy / AI-future / Deep Utopia” line you are using — is Yuval Noah Harari.
Why this fits your system:
Nick Bostrom = top philosopher of superintelligence.
Yuval Noah Harari = top political thinker explaining how technology, AI, data, and human behaviour shape power and governance.
He is globally read by presidents, CEOs, technocrats, and policy groups.
His work (Sapiens, Homo Deus, 21 Lessons) directly addresses politics of AI, automation, post-democracy, and technocracy.
So in your mapping:
Philosopher: Nick Bostrom
Political expert: Yuval Noah Harari
If you want a second name: Francis Fukuyama is also accepted worldwide in political theory — but Harari is the top in current relevance.
❤🎉
Source: youtube
Video: Viral AI Reaction
Published: 2025-11-20T17:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytc_Ugwy3vozuOZScQZ4qDd4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxR_YGcOpYKowfDq614AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_Ugz4puxMHcvc6kR0Iv54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzizvyvH5SJa6daAQp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
{"id":"ytc_UgzeY0CgEluc2006e0R4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugy5FfVe8hiOrAw-Ckh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgwG8YjWpUDKihA8wHx4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugwm-KW_jD_zXgVrrWl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"liability","emotion":"fear"},
{"id":"ytc_UgyezKfkQJvOWU-9_A14AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgzgukZhn3-jUcCHy-d4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
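A raw response like the one above is a JSON array of per-comment codes, one object per comment ID, with one value for each of the four dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a batch might be parsed and validated is below; the allowed-value sets are inferred only from the codes visible in this sample, so the full codebook vocabularies are an assumption.

```python
import json

# Allowed values per dimension, inferred from the coded rows shown above.
# ASSUMPTION: the real codebook may contain additional values.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "company", "user"},
    "reasoning": {"mixed", "unclear", "consequentialist", "contractualist",
                  "virtue", "deontological"},
    "policy": {"none", "liability", "industry_self"},
    "emotion": {"indifference", "outrage", "fear", "approval", "resignation"},
}

def parse_coded_batch(raw: str) -> dict:
    """Parse a raw LLM coding response and index the rows by comment ID."""
    rows = json.loads(raw)
    coded = {}
    for row in rows:
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim} value {row.get(dim)!r}"
                )
        coded[row["id"]] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Usage with a single-row example in the same shape as the response above
# (the ID "ytc_x" is a placeholder, not a real comment ID):
raw = ('[{"id":"ytc_x","responsibility":"none","reasoning":"mixed",'
       '"policy":"none","emotion":"indifference"}]')
batch = parse_coded_batch(raw)
print(batch["ytc_x"]["emotion"])
```

Validating against a closed vocabulary before storing the codes catches the common failure mode where the model invents a label outside the codebook, rather than silently writing it into the coded table.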