Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- `ytc_UgzYMJZxT…`: Learn there's NO expectation no perfection art IS messy art IS hard art can let …
- `ytc_UgyUn52dd…`: Autonomous weapons are pretty bad as computers and AI are terrible at making dec…
- `rdc_mrrnkny`: the amount of times i had to tell the AI the solution its trying to come up with…
- `ytc_UgyKeRHEV…`: Yes! Art is all about emotion and the stories those emotions provide, and AI can…
- `ytc_UgyWPE-a3…`: I like these videos but I'm disappointed at this one. I was looking for insights…
- `ytc_Ugw3Q-JBY…`: okay, ai can do it “faster and better”, but it’s about the artist expressing the…
- `ytr_Ugz-1m-BT…`: Ai isn't completely bad, as the original use of it was to help humans create stu…
- `ytc_UgxfDnPM8…`: As a human who identifies as a robot ... I kindly agree to disagree that is thi…
Comment
About half of all published AI researchers say there is a significant risk of human extinction from superintelligent AI. Almost everyone working at the leading labs believes this. The two most cited scientists in the world are called the "godfathers of AI," and they believe this and are extremely worried, touring the world begging for regulation to ban the creation of superintelligence, right alongside the author of the standard textbook on AI. Whistleblowers and former safety researchers at OpenAI also have a lot to say about this, and literally quit their jobs and in some cases risked the majority of their net worth in order to warn humanity about what they know.
Literally most of the world's top experts agree with most of what Yudkowsky is saying about the problem. Many of them just still haven't caught up to all of the reasons why we already know that their brilliant solutions definitely won't work, because researchers have already been down those roads.
I myself spent a few years ingesting all the arguments and empirical data myself, and they are just obviously correct.
The problem is not a few cranks. If you rank order all of the people in the world by how credible they are on this topic, by any reasonable measure you want to choose, the people at the top of the list are more concerned than the people at the bottom of the list. And a study ("Why do Experts Disagree on Existential Risk and P(doom)? A Survey of AI Experts") showed that the AI researchers who are ignorant of basic AI Safety concepts are less worried. In other words, being more knowledgeable about this topic usually makes you more worried about it. That on its own should make you more worried!
Source: youtube · Video: AI Moral Status · 2025-10-30T23:0…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | regulate |
| Emotion | fear |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytr_UgzCQ-iUBGiJbotDjLl4AaABAg.AOvG2nqLvwuAOwENDmBfQh","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgyydrhSRJE9cP4zynx4AaABAg.AOvFV0kEEVsAOvSebWOPx5","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyDuEBwo9z5ChmnA9N4AaABAg.AOvF8iNnKddAOvOLsudQZA","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytr_UgyDuEBwo9z5ChmnA9N4AaABAg.AOvF8iNnKddAOwoZIRd89p","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyDuEBwo9z5ChmnA9N4AaABAg.AOvF8iNnKddAOwqnD4BTpC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyAT_ritWLYoa8ig-B4AaABAg.AOvEqiPULPiAOw0-2ulpDw","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugz-P8VsXJjKSXwLjHN4AaABAg.AOvDxomYDG_AOvJ1C4UJFH","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytr_UgxSjbbmRsPdPz8DLRV4AaABAg.AOvDXVlfGR6AOvGjVKqzhz","responsibility":"company","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytr_UgzrmdAGaBxHu3fE2od4AaABAg.AOvDU5fjZeZAOxh6dRK6iu","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytr_UgzmiJxClhPU4ivMYwp4AaABAg.AOvCsR6OqKfAOvsD1rGUcF","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"}
]
```
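The raw response above is a JSON array with one record per comment, each carrying an `id` plus the four coded dimensions. A minimal sketch of how such a batch response might be parsed into a lookup-by-ID table (the function name is hypothetical, and the code sets list only the values observed on this page, not the full codebook):

```python
import json

# Dimension values observed in this dump; the actual codebook presumably
# defines more codes, so treat these sets as illustrative, not exhaustive.
OBSERVED_CODES = {
    "responsibility": {"none", "ai_itself", "company"},
    "reasoning": {"consequentialist", "mixed"},
    "policy": {"none", "regulate"},
    "emotion": {"fear", "indifference", "resignation", "outrage", "approval"},
}

def parse_raw_response(raw: str) -> dict[str, dict[str, str]]:
    """Parse a raw LLM batch response into a table keyed by comment ID."""
    records = json.loads(raw)
    coded = {}
    for rec in records:
        comment_id = rec["id"]
        dims = {k: v for k, v in rec.items() if k != "id"}
        # Flag any value outside the observed code sets rather than rejecting
        # it, since the real codebook may be larger.
        for dim, value in dims.items():
            if value not in OBSERVED_CODES.get(dim, set()):
                print(f"unexpected code {value!r} for {dim} in {comment_id}")
        coded[comment_id] = dims
    return coded

# Usage with a one-record batch shaped like the response above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"fear"}]')
table = parse_raw_response(raw)
print(table["ytc_example"]["emotion"])  # fear
```

Keying the table by comment ID is what makes the "Look up by comment ID" inspection above cheap: each lookup is a single dictionary access rather than a scan of the batch.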