Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "I agree! My job is going no where.. there no robot or ai that can do it , so thi…" (`ytr_Ugwl1Hl7a…`)
- "It's just a tool. Don't take it seriously. Yesterday AI said - If you love leave…" (`ytc_UgwHsGQMq…`)
- "I would have lovednto have chatgpt when I was in school, not because I would use…" (`ytc_UgwmQtgbV…`)
- "Robot"I'm alive"😂 I don't fear these there's just something about machines that…" (`ytc_Ugzes7gr4…`)
- "I mean doesn't AI need training to operate. In my country popo train for 16 week…" (`ytc_UgzfpL-Tk…`)
- "(please blur my name if you use this) but I am both a programmer (I write AI), a…" (`ytc_Ugzk2740e…`)
- "I am shocked just how many educated people think that AI exists. They have no id…" (`ytc_Ugy4ZsFeJ…`)
- "A Gen X kid here, what ever happened to the library? The fun, enjoyment, engagin…" (`ytc_UgxfAHtVQ…`)
Comment
Penrose leans on Gödel and speculative quantum effects to claim consciousness is “non‑computable,” but:

- No solid evidence of long‑lived quantum coherence in the brain has ever been found, our best neuroscience treats neurons as noisy electro‑chemical networks, not quantum computers.
- Emergence works. Complex software systems already show self‑modeling, planning, creativity and even rudimentary “intuitions” that used to be called uniquely human. If you pack enough layers of code processing information, rich internal dynamics can and do arise.
- Parsimony wins. Postulating new physics just to rescue consciousness is a huge leap when the simpler explanation, that subjective experience emerges from sufficiently intricate patterns of information flow, fits all the data so far.

Until someone demonstrates non‑computable physics in actual neurons, it’s far more reasonable to bet that AI will one day light up with genuine awareness, no “shadow” required since consciousness is viewed as an emergent phenomena in biological humans.
youtube
AI Moral Status
2025-04-21T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
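The table above can be read as one coded record per comment. A minimal sketch of that record as a typed structure, using only the field names and label values visible on this page (the full label vocabularies for each dimension are an assumption drawn from the sample):

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class CodingResult:
    """One LLM-coded comment record, mirroring the result table above."""
    responsibility: str  # seen in the sample: "none", "developer", "ai_itself", "unclear"
    reasoning: str       # seen: "consequentialist", "deontological", "mixed", "unclear"
    policy: str          # seen: "none", "regulate", "unclear"
    emotion: str         # seen: "indifference", "fear", "approval", "outrage", "mixed"
    coded_at: datetime   # timestamp the coding was produced

# The record shown in the table above:
result = CodingResult(
    responsibility="none",
    reasoning="consequentialist",
    policy="none",
    emotion="indifference",
    coded_at=datetime.fromisoformat("2026-04-26T23:09:12.988011"),
)
```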
Raw LLM Response
```json
[{"id":"ytc_Ugy0eEbTZrFd19pjEg94AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxhVwQl4ejcLQZoxZx4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgyXdM7Fc-6yELJ4D8h4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgwKbyLaRfwD2i0cJqp4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxKaargc3XDmnK1B_54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzAHwFExuxVyJ6cUKl4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzhdkBMF6LiVCJKBgZ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxrheKRgQHXL43DWTN4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxF5Lm4hQRRra1kURB4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy3lkG5aidJJS-allN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]
```
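A raw response in this shape (a JSON array of per-comment records) can be parsed and tallied with a few lines of standard-library Python. This is a sketch against the format shown above, not the tool's actual ingestion code; the two inline records are abbreviated copies from the sample:

```python
import json
from collections import Counter

# Two records copied from the raw response above, as a stand-in for the full array.
raw = '''[
 {"id": "ytc_Ugy0eEbTZrFd19pjEg94AaABAg", "responsibility": "none",
  "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
 {"id": "ytc_UgyXdM7Fc-6yELJ4D8h4AaABAg", "responsibility": "developer",
  "reasoning": "mixed", "policy": "regulate", "emotion": "fear"}
]'''

records = json.loads(raw)

# Tally the label distribution for each coded dimension.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(r[dim] for r in records) for dim in dimensions}

for dim in dimensions:
    print(dim, dict(tallies[dim]))
```

The same loop also doubles as a sanity check on the model output: any record missing one of the four keys raises a `KeyError` immediately rather than silently skewing the counts.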