Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hi! I insist, I suggest you interview Yuval Noah Harari. He is an Israeli historian, philosopher, and author, born in 1976. He is best known for his books *Sapiens: A Brief History of Humankind* and Homo Deus: A Brief History of Tomorrow explores the future of humanity, positing that as we overcome historical challenges like famine, disease, and war, our new goals will be happiness, immortality, and god-like powers. The book examines how technological advancements, particularly in biotechnology and artificial intelligence (AI), could lead to a new form of human or even render Homo sapiens obsolete, replaced by new entities or a more powerful, upgraded version of ourselves. Ultimately, the book questions where humanity is headed and how we will manage the immense power we are gaining.  Thanks a lot Miguel
youtube 2025-11-28T00:1…
Coding Result
| Dimension      | Value                      |
|----------------|----------------------------|
| Responsibility | none                       |
| Reasoning      | unclear                    |
| Policy         | none                       |
| Emotion        | indifference               |
| Coded at       | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id": "ytc_Ugyqpyyrz5ZC3IwGVdZ4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgzzFmhrT1nc1BziPVx4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxD5uAW2kplAjenDGR4AaABAg", "responsibility": "ai_itself", "reasoning": "unclear", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgxQEBgw5EDWCKF9wJV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_Ugw9SMBmJW7qanNaAcV4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzTa72qIjNwsYDPE6V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugz7IEQrxfY2USSQUb14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugzb0jWJvEqfObeBu6F4AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugy-ycJmzDrmBLK-8c14AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgxlMrTgKvPy6YzomQB4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "resignation"}
]
```
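To trace a single comment's coding back to the raw model output, the response can be parsed and indexed by `id`. A minimal sketch, assuming the raw response is (as above) a JSON array of per-comment objects with `id`, `responsibility`, `reasoning`, `policy`, and `emotion` fields:

```python
import json

# Excerpt of a raw LLM response in the array-of-codings shape shown above.
raw = """[
  {"id": "ytc_Ugyqpyyrz5ZC3IwGVdZ4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxD5uAW2kplAjenDGR4AaABAg", "responsibility": "ai_itself",
   "reasoning": "unclear", "policy": "ban", "emotion": "fear"}
]"""

# Index the batch by comment id so any coded comment can be looked up directly.
codings = {entry["id"]: entry for entry in json.loads(raw)}

coding = codings["ytc_UgxD5uAW2kplAjenDGR4AaABAg"]
print(coding["policy"], coding["emotion"])  # -> ban fear
```

Because the model returns one batch of codings per call, building this id-keyed index once lets the inspection view resolve each displayed comment to its exact raw coding.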