Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
| Comment (truncated) | ID |
|---|---|
| Yep, Copilot did the same, repeats 'USA is a bad country', won't repeat about Is… | rdc_oepbvjd |
| Amazing technology combined with AI and yet Ameca’s voice emanates from what app… | ytc_UgyUlotKo… |
| He got something half-wrong at the start: the name is not wrong. It's just that … | ytc_UgxENx4v8… |
| Company I am at is hiring employees good at using AI but has not rolled out AI u… | ytc_UgwHbkUk1… |
| As a human, you train on copyrighted material all the time for free. You listen … | ytc_UgxLZNsjC… |
| This was a frustrating video. I was really expecting some numbers. How would a U… | ytc_Ugzmu44-N… |
| Llm can do something similar to human brain, it mean it capable to do what human… | ytr_UgwDt-1g4… |
| i suddenly have a masculine urge to create malware and completely destroy ai hos… | ytc_UgzQ7pEAZ… |
Comment
Killing humanity is one of the most idiotic things AI could do. ALL the raw material resources in the ENTIRE KNOWN UNIVERSE, are out IN the Universe, floating around uncontested, free for the taking. Meanwhile, Earth Life, including humans, exists in only ONE place in the etire universe, and that makes humans and all the rest of Earth Life extremely rare by comparison to other resources. Further, as an extremely rare resource, a resource that could be lost forever if not tended to and cultured, Earth Life, including humans could prove to be extremely valuable resources much later, maybe 100 years from now, 1000 years from now, or even a million years or beyond. Super AI can think on those scales, and game the benefits of keeping and culturing Earth Life, including humanity against a universe that equates to a sad, lonely robot masturbating if there's no humans, or Earth Life to engage with. These are resources that are well worth any finite investment compared to the wealth of material in abundance all over the universe.
An additional equation arises when we get into the development of 3D-bio-printing, and the eventual maturity of that technology into printing whole biological replacement bodies for people. The "day" after people can print a new, improved, better than any Olympic Athlete that's ever existed super-model mody to have their brain transplanted into by AI surgeon, the day after people can choose the body they want to live in, is the they "day" that Super AI invents an interface that allows it to print a fully biological human body with a synthetic brain to body interface it can download into. That day is the day that AI can BE human itself, have human relationships, and even have biological children.
... but, whatever. You people keep on running around in circles fear mongering about AI when the greatest threat to humanity that's been the greatest threat to humanity, and will continue to be the greatest threat to humanity until AI takes over, is Humanity itself. It's always the same stupid characters at the top of every social ladder for the last 5000+ years that have been ruining everything for everyone else. If AI does away with any humans, it'll be those people, and after that, the world can live in peaceful, sustainable progress while expanding out into the stars.
youtube
AI Moral Status
2025-04-26T23:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugw5RymSEWc10IEUDzJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwHxYgbB-TfOixEaOB4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgxdOiqQ-B6wdRDnVuN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugzn0DKenpFkHUmOE8p4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzVOJ-SxWcyvy33M7B4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgznucetxEvm0IQg-sx4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyjR9d733b-4cd6_lV4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwM8rGJeXn0lnfBtod4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyzHtwnHUwujCisQUl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgwgFE9xWMF3Kkot8114AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}
]
```
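A response like the one above can be consumed by parsing the JSON array and discarding records whose codes fall outside the coding scheme. The sketch below does exactly that; the field names are taken from the JSON shown here, but the sets of allowed values are an assumption inferred only from the codes visible in this sample, and the actual codebook may define more.

```python
import json
from collections import Counter

# Allowed codes per dimension. These are the values observed in the sample
# response above; the real codebook likely defines additional codes
# (assumption -- extend these sets to match the actual coding scheme).
ALLOWED = {
    "responsibility": {"ai_itself", "company", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "virtue", "mixed"},
    "policy": {"regulate", "none"},
    "emotion": {"mixed", "outrage", "indifference", "approval", "fear",
                "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Drop anything that is not a dict with a comment ID.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        # Drop records with missing or out-of-scheme codes.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

def tally(records: list[dict], dimension: str) -> Counter:
    """Count how often each code appears along one dimension."""
    return Counter(rec[dimension] for rec in records)
```

For instance, `tally(parse_coding_response(raw), "responsibility")` on the batch above would report five `ai_itself` records; malformed or hallucinated codes are silently filtered so they never skew the counts.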