Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples:

- "This is more likely to happen over a much longer period of time, more like fifty…" (ytc_UgxAAJhx4…)
- "@crowe6961 That's a lot of words to try and rationalize theft. Hey guess what, S…" (ytr_UgxmSqJQq…)
- "I think AI will reveal the most basic of human functions, the need to create and…" (ytc_UgzPvignY…)
- "Charles makes an excellent point about the "creative" process utilized by the AI…" (ytc_UgweSiwfe…)
- "I’m looking at the topics (which are made by AI) and I can’t help but thinking, …" (ytc_UgxDD9Uqq…)
- "I think one of the problems is that people think automation = utopia, but that's…" (ytr_UgxQ_JH_M…)
- "I do not keep up with the UFO media hype, but I do believe that AI entities from…" (ytc_Ugxm9l5Th…)
- "Real entertainers and artists would keep it alive at best without ai that cant b…" (ytc_Ugyr-satP…)
Comment
To date, AIs do not rewrite their own code. They can write classical code, like a traditional software developer, but AIs themselves are not code. The AI would have to hit the "button" on training or refining existing training, and it cannot do that yet. That training is partially why we need more power plants and why it costs so much money to create an AI. Yet, AIs don't want anything except what they have been told to want. This is true no matter what stories people tell. Any story involving AIs wanting anything was a trick of the instructions they gave the AI. To want something, you have to be at least sentient, and we are not there yet. The AI models today have zero capacity to make a decision. Anything that looks like a decision is classical programming behind the scenes, or you giving it instructions.
| Field | Value |
|---|---|
| Platform | youtube |
| Video | AI Moral Status |
| Posted | 2026-03-01T21:5… |
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzD9GaNyIov9q1l4bF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugwnd3NBaXhbQgRCsM94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgylVCbn4jz6RQ-ayY54AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwtFsfZpLL7eyzY2yB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgyfnZy3Fk-pgK61G5d4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzlIPwX0JmuTf3ucCt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzkEkqd0x-OKM5R25l4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
{"id":"ytc_Ugw9Vx2Mn62GxUFMxop4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxpCwu_Ifhu1adxzZp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxlpwrFv8zgLkUzeal4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"}
]
```
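A downstream consumer of raw responses like the one above has to parse the JSON and check each record before writing it into a coding table. Here is a minimal sketch in Python; the dimension names come from the sample response, but the full allowed value sets are an assumption inferred from the values visible in this one batch:

```python
import json

# Allowed values per coding dimension. These sets are assumptions:
# only the values observed in the sample response above are listed.
ALLOWED = {
    "responsibility": {"developer", "company", "none"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "fear", "approval", "mixed", "outrage"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response into validated coding records.

    Raises ValueError on a missing comment id or an out-of-vocabulary
    value, so a bad batch fails loudly instead of polluting the table.
    """
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing comment id: {rec!r}")
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec['id']}: bad {dim!r}: {rec.get(dim)!r}")
    return records

# Usage with a one-record batch (hypothetical id for illustration):
raw = ('[{"id":"ytc_x","responsibility":"developer",'
       '"reasoning":"consequentialist","policy":"none","emotion":"fear"}]')
records = parse_codings(raw)
print(records[0]["emotion"])  # fear
```

Failing fast on unknown values is a deliberate choice here: silently coercing an unexpected label to `unclear` would hide model drift between batches.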