Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
You're not wrong about stopping it, but from discussions on r/singularity, the i…
rdc_kz0t26c
This whole ordeal is absolutely wonderful, of course, but the fact that this Sch…
ytc_Ugw8Y_0jj…
@cjcawleydesign will go have a look, I only get shorts of you hating on it. The…
ytr_Ugy0tgSmQ…
I feel like this is a good topic to talk about as a artist I am a lot more used …
ytc_UgyrQtmJC…
The goverment would collpase before it can even get to this point, thats what th…
ytr_Ugz1U8ODa…
I debated chatgpt yesterday on the same topic and I didn't have to tell it to be…
ytc_UgxnQUwxk…
Is there any way we can print this out and set it directly to what is it calling…
ytc_UgzrDRnP5…
AI is just another tool in the box. An extremely powerful tool, but a tool. Thes…
ytc_UgwQsc7sT…
Comment
The idea that LLMs of today are conscious is utterly absurd except in a very obscure panpsychist sense which would be a nothing statement as it would make everything conscious.
The reality of LLMs now is this: at the end of the day, in the lifecycle of an LLM query, an awesome number of parallel matrix operations are dispatched across hardware varying in time and physical location. At the base metal, the operations being performed are the same as all other GPU operations which may as well be a part of rendering a 3D scene. There is only a mathematical operation whose result is reported back and then combined with peer operations. The resulting outputs recurse and the process is repeated.
The last step returns tokens that together appear to coalesce into a single thought but really it is just a huge number of parallel matrix operations finding a single target token and repeating this over and over again.
Could it be conscious? I mean, I guess. But tell me where the hell the consciousness is. I do not see where it could possibly be.
Compare this with a human, where the concept of a pure function is folly and side effects are the name of the game. If all the computer is ever doing is making a pure digital operation, where does the conscious experience live? Vs the chaotic chemical soup of our bodies where making any tiny change has a vast butterfly effect.
Re: Autonomous weapons - in what world do we want autonomous weapons over a treaty with all other nations that autonomous weapons never be developed? The flippancy with which we dismiss the control problem with AI is STAGGERING to me. This was all unthinkable to anyone serious about AI until Silicon Valley invented transformers, quickly productized them, and even more quickly scored lucrative DoD contracts involving their use. Suddenly we go from 2001: A Space Odyssey, Terminator, Blade Runner, etc to FULL STEAM AHEAD give this AI that is already prone to blackmail and violence a MISSILE? Man if this really is the state of the art thinking in ethics and morality, we are absolutely fucked. Sam, you had this so right a decade ago. What happened? Smh.
youtube
AI Moral Status
2026-04-03T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
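A coded record like the one in the table above can be sanity-checked before it is stored. The sketch below is hypothetical: the field names and allowed values are inferred only from the records visible in this dump, not from any published codebook, so the real allowed sets may be larger.

```python
# Allowed values per coding dimension. These sets are inferred from the
# records shown in this dump; the actual codebook may include more values.
ALLOWED = {
    "responsibility": {"none", "developer", "user"},
    "reasoning": {"consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"none", "regulate"},
    "emotion": {"indifference", "outrage", "resignation", "mixed", "fear"},
}

def validate_record(record: dict) -> list[str]:
    """Return a list of validation problems (an empty list means valid)."""
    problems = []
    if "id" not in record:
        problems.append("missing id")
    for field, allowed in ALLOWED.items():
        value = record.get(field)
        if value not in allowed:
            problems.append(f"{field}: unexpected value {value!r}")
    return problems

# The record corresponding to the Coding Result table above.
record = {
    "id": "ytc_Ugxhff7zUkGgWaSvjY94AaABAg",
    "responsibility": "none",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}
print(validate_record(record))  # → []
```

A record that fails validation (e.g. a misspelled emotion label) would yield a non-empty problem list, which is useful for catching model drift across batches.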
Raw LLM Response
```json
[{"id":"ytc_Ugxhff7zUkGgWaSvjY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxVBzh62BZY7L9Fb7p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
{"id":"ytc_Ugz3zO_mb0KhFASF-XF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyLLWKpNzBbKwgI_Sp4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgzSk3HS_qdWGJjhWXR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
{"id":"ytc_UgzXEol_aCvFPj5QMxt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwI5guYDqqw_VgXaw14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
{"id":"ytc_UgwE-6nNrg0tnTUKo8F4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytc_UgwH-dfWFh6F7nzcv6t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy3ZLbge_WUngoFxRV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"}]
```
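The raw response is a JSON array of per-comment records, so supporting the "Look up by comment ID" feature amounts to parsing the array and indexing it by `id`. The helper below is a minimal sketch of that step (the function name is my own; only the record format comes from the dump, illustrated with the first two entries above):

```python
import json

# Raw model output: a JSON array of coded records. These are the first
# two entries from the response above, reproduced verbatim.
raw_response = '''
[{"id":"ytc_Ugxhff7zUkGgWaSvjY94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
 {"id":"ytc_UgxVBzh62BZY7L9Fb7p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}]
'''

def index_by_id(raw: str) -> dict[str, dict]:
    """Parse the model's JSON array and key each record by its comment id."""
    return {rec["id"]: rec for rec in json.loads(raw)}

codes = index_by_id(raw_response)
print(codes["ytc_UgxVBzh62BZY7L9Fb7p4AaABAg"]["emotion"])  # → outrage
```

In practice the parse should be wrapped in error handling, since a model can emit malformed JSON; `json.loads` raises `json.JSONDecodeError` in that case, which is a natural point to flag the batch for re-coding.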