Raw LLM Responses
Inspect the exact model output for any coded comment by looking it up by comment ID.
Random samples
- "Calling oneself an "ai artist" is such an oxymoron! If you think you're an ai aR…" (ytc_UgyXo3q2z…)
- "Here she is again, failing the nuance and going straight for the fearmongering. …" (ytc_UgwQWYqQN…)
- "Ultimately none of our opinions matter with regards to this topic. We aren't t…" (rdc_oh1kvda)
- "Commenting to algorithmically support your noble fight, and also to say that the…" (ytc_UgwsmzPir…)
- "computer science (CS) is not strictly learning a language or framework or creati…" (ytr_UgwXQtRZH…)
- "Use claude code and learn how to use it correctly and it will one shot this kind…" (ytc_UgzgCnNNG…)
- "We have the opportunity to end this, together, right now. But we won't because …" (ytc_UgwKQd_7D…)
- "ChatGPT can be useful at finding valid sources, but the catch is you need to ask…" (ytc_Ugy9EmS3H…)
Comment
Awareness of oneself doesn't necessarily produce consciousness. A computer can be aware of itself by checking its internal state, but we don't know if that produces consciousness because we aren't computers. I think your guess is a good one, but it's just a guess.
I disagree that free will doesn't exist. It's obviously not free will when a doctor checks my reflexes and my leg swings involuntarily, but I think you mean complete lack of free will. My brain produces my thoughts and actions, and part of my brain is me. So, even if essentially deterministic processes outside of my control lead to my thoughts and actions, I still have free will because I can do what I decide to do (so long as reflexes and instinct or whatnot doesn't get in the way).
I wasn't arguing that Searle's argument is right. I was arguing that it isn't moronic because we know so little. Intention could be required for consciousness depending on what consciousness is. For example, consciousness may be composed of thoughts, and thoughts might always have purpose (curiosity, problem solving, etc.) Thoughts might be different from mere perception and involve attention, imagination, and/or unknown things, thereby distinguishing us from the Chinese room if it doesn't have those properties.
Weak AI already surpasses its hard-wired programming. Neural nets can learn patterns on their own, for example.
Source: youtube · 2016-08-11T01:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
  {"id":"ytr_UggM-62O18J5XHgCoAEC.8HPzS7kktwF8HQI1_-2meb","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UggM-62O18J5XHgCoAEC.8HPzS7kktwF8HQLl3--9Fo","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UggM-62O18J5XHgCoAEC.8HPzS7kktwF8HSEPVUKG__","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UghlVHdKSsFDl3gCoAEC.8HPTCQhSvuu8HPpuAU9UhR","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UghlVHdKSsFDl3gCoAEC.8HPTCQhSvuu8HRyAoLyJjN","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugh_eqMzofsL5ngCoAEC.8HOAX7pq1HW8HOAk08tvFH","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
  {"id":"ytr_Ugjp4eawKYfkSngCoAEC.8HMsrQ1U-oG8HTfbJSgY9B","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_Ugh42eOQdjXgzXgCoAEC.8HMbaVIIfk591D6Q8AqgH1","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UghOoR5pLNkFAXgCoAEC.8HMIXY65na68HREeeFqIse","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_UgjS-A6UYU5fpXgCoAEC.8HLoOJ-l-5t8HMPpEK-y44","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"}
]
```
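The lookup-by-comment-ID step above can be sketched as follows. This is a minimal sketch, not the tool's actual implementation: it assumes only the response structure shown above (a JSON array where each entry carries an `id` plus one value per coding dimension). The IDs in the sample data here are hypothetical placeholders.

```python
import json

# Hypothetical sample data mirroring the response structure shown above.
raw_response = """
[
  {"id": "ytr_example1", "responsibility": "none", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_example2", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]
"""

def lookup_coding(response_text, comment_id):
    """Parse a raw LLM response and return the coding entry for one comment ID,
    or None if that ID is absent from the batch."""
    by_id = {entry["id"]: entry for entry in json.loads(response_text)}
    return by_id.get(comment_id)

coding = lookup_coding(raw_response, "ytr_example1")
print(coding["reasoning"])  # mixed
```

Building the `by_id` dict once per response makes repeated lookups O(1), which matters if many comment IDs are inspected against the same batched response.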