Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The thing is AI can be autonomous while all the inventions before AI in this wor…" (ytc_UgydF81i3…)
- "This is a load of balls. AI is controlled by humans and limits can be set by hum…" (ytc_UgwH-v3m2…)
- "Interesting that AI was used to create most of the images for the ECHO part of t…" (ytc_UgwjjFh3I…)
- "Thanks for your comment! It will be interesting to see how each country approach…" (ytr_UgxBbCJpU…)
- "if you want to be prepared to give AI rights how about we give animals rights fi…" (ytc_UghMInwGG…)
- "Women are already marginalised by AI. Just listen to the AI narrators for a sta…" (ytc_Ugy35GzvJ…)
- "Until the Robot goes to the fridge and actually pours the drink itself and bring…" (ytc_UgyOILAZn…)
- "Research SORA. It’s affiliated with OpenAI which was co-founded by the real rac…" (ytc_UgwaXMF2N…)
Comment
I suggest you interview Yuval Noah Harari. He is an Israeli historian, philosopher, and author, born in 1976. He is best known for his books *Sapiens: A Brief History of Humankind* and *Homo Deus: A Brief History of Tomorrow*. The latter explores the future of humanity, positing that as we overcome historical challenges like famine, disease, and war, our new goals will be happiness, immortality, and god-like powers. The book examines how technological advancements, particularly in biotechnology and artificial intelligence (AI), could lead to a new form of human, or even render Homo sapiens obsolete, replaced by new entities or a more powerful, upgraded version of ourselves. Ultimately, the book questions where humanity is headed and how we will manage the immense power we are gaining.
Thanks a lot
Miguel
youtube
AI Governance
2025-11-21T16:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugyfhkjw8aNu-lSeUdl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxhQEk9nqvgxhYxPK54AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwVC9xxShioD_ByNud4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxI8pY_qHMU9i22-5B4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugyy1A2czmMnQ2w8aX94AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgxteEoFb5Wk1765n0V4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwU1LUel5UmyHPRwBt4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzTdfKYO2JVgb1rPUN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgySapaPoFFF3h-cadd4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgymcvrnCH5jzoMCVt54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"}
]
```
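Because the raw response is a JSON array in which each record carries the comment's `id` plus the four coded dimensions (`responsibility`, `reasoning`, `policy`, `emotion`), the "look up by comment ID" feature reduces to a parse-and-filter. A minimal sketch, assuming only the response shape shown above (the `lookup_by_id` helper and the two-record sample string are illustrative, not part of the actual pipeline):

```python
import json
from typing import Optional

# A shortened stand-in for a raw LLM batch response in the format shown above.
raw_response = """
[
  {"id": "ytc_Ugyfhkjw8aNu-lSeUdl4AaABAg", "responsibility": "none",
   "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxI8pY_qHMU9i22-5B4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
"""

def lookup_by_id(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw batch response and return the record coded for one comment ID."""
    records = json.loads(raw)
    return next((r for r in records if r["id"] == comment_id), None)

coded = lookup_by_id(raw_response, "ytc_UgxI8pY_qHMU9i22-5B4AaABAg")
print(coded["emotion"])  # -> fear
```

In practice the parse step would also want to validate that every record contains all four dimension keys, since a malformed model response would otherwise surface only later, when a dashboard cell renders empty.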