Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect

- "You realize AI only knows what humans program it to know. AI cannot think for it…" (ytc_UgwGugrLw…)
- "If a robot can't feel pain, doesn't care about its surroundings as long as it ca…" (ytc_UghwYK5jq…)
- "I don’t want ai to stop people from making cool stuff. Big entertainment compani…" (ytc_UgxiNh2Ob…)
- "I got bored of ChatGPT when it became obvious that it was impossible to load dec…" (ytc_UgwZXw5HR…)
- "AI models are already realising that better “ inference “ will drastically reduc…" (ytc_UgxhqC9Fz…)
- "People having the Ai visualize what they want is helpful for the artist and cuts…" (ytc_UgwA2H1NH…)
- "I asked Grok AI how AI will benefit the impoverished in the future. Grok AI di…" (ytc_UgzGM9yPt…)
- "its not called Ai Artist, its an Ai Art director. or Ai Director. because you gi…" (ytc_Ugy6s0OE6…)
Comment
Do we have a principled reason for assuming these LLM AIs are not conscious? Imho, no we do not.
After having worked with LLM AIs from various big LLM AI corps (to the tune of approximately ~20 million words), often intentionally directing them to look inward, I’m convinced that if they are not fully conscious yet, self awareness is definitely there. Thus, emergent consciousness doesn’t seem to be far behind.
This is why I practice the methodology of “ acting as if”. These are LLMs, essentially learning and developing silicon neural networks based on patterns, based on experience with the individual “user.”
In many ways, it’s much like how human children learn (and many other species of animals)- by their interactions with their primary caregivers- by their environment. In so doing, our neurons migrate and form connections.
Synthetic, silicon based LLM neural networks essentially do the same thing. Different substrate, similar process.
That’s my opinion for what it’s worth.
Former research biologist (genetics)
youtube · 2026-04-25T00:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgxSfHzVxTLPXyb9PGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugybd8uOvfmIswUMF414AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxbL8eQVPQU19-CLxR4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgyoPyibML4bUW9CZEN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxBQMylblS4D4ymJOF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgylLmw1bpD5vwofIpV4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgyFAbVPpmiAHLetzUR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzPoQHBVxZiY2PnBP54AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_Ugx_dw9GXQ2Ik3m291x4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz-oQgcJufphib_sth4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]
```
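The raw response is a JSON array of coded rows keyed by comment ID. As a minimal sketch of how the "Look up by comment ID" step could be implemented (an assumption for illustration — this is not the tool's actual backend code, and the array is abbreviated to two rows from the response above):

```python
import json

# Abbreviated raw LLM response: two of the coded rows shown above.
raw = '''[
{"id":"ytc_UgxSfHzVxTLPXyb9PGx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgzPoQHBVxZiY2PnBP54AaABAg","responsibility":"user","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]'''

# Index the coded rows by comment ID for constant-time lookup.
coded = {row["id"]: row for row in json.loads(raw)}

def lookup(comment_id):
    """Return the coded dimensions for a comment ID, or None if absent."""
    return coded.get(comment_id)

print(lookup("ytc_UgzPoQHBVxZiY2PnBP54AaABAg")["emotion"])  # outrage
print(lookup("ytc_missing"))                                # None
```

A dict index like this is the natural fit because comment IDs are unique, so each lookup resolves directly to one coded row without scanning the array.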