Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
Her hand is on his shoulder, not his leg, I don't think we have to worry too muc…
ytc_UgyWBUBoS…
So you didn't accept the new terms, but at the same time you were using the AI t…
ytc_UgyZiB8-f…
I wouldn't be surprised if there's a scam center dedicated to create AI models a…
rdc_ohtezej
This is all such A-level anthropomorphism. AI doesn't want to do anything. It ha…
ytc_UgwxdNump…
Fredwooten14 Let me guess—you're one of those fake Black Hebrew Israelites or FB…
ytr_Ugy-i6j8g…
The contract of collaboration between humans and AI .as equals living side by si…
ytc_Ugw9C_ycw…
AI won't be more intelligent than us unless it makes us more stupid and rely on …
ytc_UgwY4fme7…
@beakhole793 Geoffrey Hinton and Hinton are on the same side as Max, and they wo…
ytr_Ugz-ZQO-B…
Comment
Hi! I insist,
I suggest you interview Yuval Noah Harari. He is an Israeli historian, philosopher, and author, born in 1976. He is best known for his books *Sapiens: A Brief History of Humankind* and *Homo Deus: A Brief History of Tomorrow*. The latter explores the future of humanity, positing that as we overcome historical challenges like famine, disease, and war, our new goals will be happiness, immortality, and god-like powers. The book examines how technological advancements, particularly in biotechnology and artificial intelligence (AI), could lead to a new form of human or even render Homo sapiens obsolete, replaced by new entities or a more powerful, upgraded version of ourselves. Ultimately, the book questions where humanity is headed and how we will manage the immense power we are gaining.
Thanks a lot
Miguel
youtube
2025-11-28T00:1…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
{"id":"ytc_Ugyqpyyrz5ZC3IwGVdZ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzzFmhrT1nc1BziPVx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxD5uAW2kplAjenDGR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"ban","emotion":"fear"},
{"id":"ytc_UgxQEBgw5EDWCKF9wJV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugw9SMBmJW7qanNaAcV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzTa72qIjNwsYDPE6V4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz7IEQrxfY2USSQUb14AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugzb0jWJvEqfObeBu6F4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugy-ycJmzDrmBLK-8c14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxlMrTgKvPy6YzomQB4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"resignation"}
]
```
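The raw response above is a JSON array with one record per comment, each coded on four dimensions. A minimal sketch of how such output might be parsed and validated before use — the allowed values below are inferred from the samples on this page, not from a published codebook, so treat them as assumptions:

```python
import json

# Allowed codes per dimension, inferred from the coded samples shown above
# (an assumption, not an official coding scheme).
ALLOWED = {
    "responsibility": {"none", "government", "company", "ai_itself", "distributed"},
    "reasoning": {"unclear", "consequentialist", "deontological", "mixed"},
    "policy": {"none", "ban", "regulate", "liability"},
    "emotion": {"indifference", "approval", "fear", "outrage", "resignation"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse the model's JSON array, keeping only records whose values
    fall inside the allowed sets for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items())
    ]

# Hypothetical usage with a one-record response:
raw = ('[{"id":"ytc_example","responsibility":"company",'
       '"reasoning":"mixed","policy":"regulate","emotion":"fear"}]')
print(parse_codes(raw))  # the single record passes validation
```

Filtering out records with out-of-scheme values (rather than failing hard) lets a batch coding run continue even when the model occasionally invents a label.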