Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Thank you for such an extraordinary interview. I'm not afraid at all; I see everything as an opportunity, and in this case, a reaffirmation of what I've experienced in my life since I started Vipassana meditation. It has allowed me to observe the impermanence that reveals the simulation or illusion in which human beings live trapped. I suggest you interview Yuval Noah Harari. He is an Israeli historian, philosopher, and author, born in 1976. He is best known for his books *Sapiens: A Brief History of Humankind* and *Homo Deus: A Brief History of Tomorrow*. The latter explores the future of humanity, positing that as we overcome historical challenges like famine, disease, and war, our new goals will be happiness, immortality, and god-like powers. The book examines how technological advancements, particularly in biotechnology and artificial intelligence (AI), could lead to a new form of human, or even render Homo sapiens obsolete, replaced by new entities or a more powerful, upgraded version of ourselves. Ultimately, the book questions where humanity is headed and how we will manage the immense power we are gaining. Thanks a lot Miguel
youtube AI Governance 2025-11-11T20:1…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | unclear                    |
| Reasoning      | mixed                      |
| Policy         | none                       |
| Emotion        | approval                   |
| Coded at       | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[
  {"id":"ytc_Ugx3QMz3HszogANDzFR4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgzPFY3oVP27dYKhdGF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgyeyXY_OZ4K7g_0AQN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyWsAF3eKpjq34CQ2B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugxh0XZAyIyKB2hiiQ14AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzyNaumB1bWFAsXc-J4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzosjqyMQlTpegMjN54AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyTZsI2whXr4ooW-b14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxgMIoY-_iTDi9BsMR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgwYi6Yxdsslf4d14ql4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
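A raw response like the one above can be parsed and sanity-checked before use. The sketch below is a minimal example, assuming the allowed category sets inferred from the values visible in this log (they are not a confirmed schema), and a hypothetical helper name `parse_coded_comments`:

```python
import json

# Assumed coding vocabularies, inferred from the values seen in this log.
ALLOWED = {
    "responsibility": {"government", "company", "developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_coded_comments(raw: str) -> list:
    """Parse the JSON array and reject any record with an out-of-vocabulary code."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim}={rec.get(dim)!r}")
    return records

# Usage with a one-record sample (hypothetical id):
raw = '[{"id":"ytc_x","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}]'
records = parse_coded_comments(raw)
print(len(records))  # 1
```

Validating against a fixed vocabulary catches the common failure mode where the model invents a label outside the coding scheme, so bad records fail loudly instead of silently skewing the tallies.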