Raw LLM Responses

Inspect the exact model output for any coded comment, either by looking it up by comment ID or by browsing the random samples below.

Random samples
- "Nah but can u imagine tho. Your not winning bro a robot specifically programmed …" (ytc_UgxRoS9Hk…)
- "The exploiter segment was very interesting to me, because it explores a common t…" (ytc_UgypWnDPg…)
- "1. Read better txt who are behind microchip record audio tape 2. Stop fooled…" (ytc_UgyhpfMyo…)
- "AI can never replace HUMANS - I am 100% Sure. What happened with Deloitte & Aust…" (ytc_Ugweh--lP…)
- "@neonSeal.mp3 Yeah but context is important, they could have referred to AI “art”…" (ytr_UgwGm8srW…)
- "Boooooo, the day ai replaces me is the day I kns I’m not joking I don’t want to …" (ytc_UgxXravOn…)
- "Nah, you should fear the AI, because we already aren't controlling it anymore. T…" (ytr_Ugy7c_-Fs…)
- "This is a big issue for artists who are rightly upset. AI will progressively tak…" (ytc_UgzxGk_eN…)
Comment
What I got from this is that Eliezer Yudkowsky, has zero clue how natural selection works. The "voice of natural selection" was the most idiotic thing I have ever heard. It is what happens when you don't stay in your lane. Natural selection cannot work with things that are hypothetical or don't exist. It does not have a purpose. It does not have an intent. It does not have a design and it does not have an idea of where "it" is heading. It needs none of those things. In fact this anthropomorphisation of natural selection is the very antithesis of the concept.
Ezra refusing to take that bargain is not him "deviating from the purpose of natural selection". There is no such thing as a deviation from the purpose of natural selection. Natural selection is an expression of what is. You have a variety of strategies being adopted by organisms and the most successful live on through their decendants. This REQUIRES different strategies and it REQUIRES randomness. So Ezra taking some stance while someone else taking another is the fuel that natural selection feeds on. Biological entities aren't "deleted" by deviating from natural selection. They simply compete and the most successful win the less successful don't have as many descendants and eventually do die out. Thus biological entities can never escape natural selection. We were absolutely not shaped by natural selection to think logically about an "optimal replication strategy" and take it. That kind of thing does not exist to begin with. If it was possible for humans to do that we would already be doing it. Using that as a means to explain how the AI might escape the intentions of its creators, is patently absurd.
Even worse the choice of the scenario is trully terrible. Its not a physically plausible event, so a willingness to accept such a deal could not have been selected by natural selection, you are not passing on a survival strategy nor get a reliable way to maintain the material you passed on (if it makes things worse it will be lost eventually) and even the deal it self is bad. Sacrificing u just sacrificed 45/46ths of your genome on the off chance that copying 1/46th to random children might result in greater survival of that bit of your genome sounds like a terrible deal even for the chromosome you did copy.
I am so sick and tired of computer scientists with zero understanding of biological organisms trying to tell us they understand consciousness and sentience, when they regularly demostrate that they dont.
Source: youtube · Topic: AI Governance · Posted: 2025-10-18T20:5… · ♥ 2
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | deontological |
| Policy | none |
| Emotion | outrage |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
```json
[
  {"id":"ytc_Ugz54tRSGTf2WUpK3XB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugwex0cyIxLK0IvzWZl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxgtOpmkin3sT4E5bt4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwSNavfO0EFFp1ZIW14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgyVqiB2wQ0iw5-d1794AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugwf6yGy9XbDbxjJEUZ4AaABAg","responsibility":"company","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzALobPq-0dklCoiS54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgwBuWyXbWDR7pMniZt4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwoUIq-U3tZOcbWT7l4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxmPi5LHrKMWrCuytV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
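The raw response is a JSON array, one object per comment, coded on the same four dimensions shown in the table above. A minimal sketch of how such a batch might be parsed and validated before being written to the coding results (the allowed codebook values are inferred from the samples on this page, and `parse_batch` is a hypothetical helper, not part of the actual pipeline):

```python
import json

# Codebook inferred from the sample output above; the pipeline's real
# codebook may contain additional values (an assumption, not confirmed).
ALLOWED = {
    "responsibility": {"none", "user", "developer", "company", "ai_itself", "unclear"},
    "reasoning": {"deontological", "consequentialist", "mixed", "unclear"},
    "policy": {"none", "ban", "regulate", "unclear"},
    "emotion": {"fear", "outrage", "resignation", "indifference", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a dict keyed by comment ID,
    rejecting any row whose dimension value is outside the codebook."""
    coded = {}
    for row in json.loads(raw):
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{row.get('id')}: bad {dim!r} value: {row.get(dim)!r}")
        # Store everything except the ID itself as the coding result.
        coded[row["id"]] = {k: v for k, v in row.items() if k != "id"}
    return coded
```

Validating against a fixed codebook like this catches the common failure mode where the model invents a label outside the schema, so a bad batch fails loudly instead of silently polluting the coded dataset.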