Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "I have the solution for combating AI. Go over to the wall and unplug it. As for …" (ytc_UgyqYuaUt…)
- "Something I don't get and seems kinda contradictory to me from Roman's POV: Why …" (ytc_UgxHu_EWM…)
- "I think we're dipping into dis illusion already. Because all we're getting this …" (ytc_Ugy39yyrz…)
- "Now thats neat: http://bi.gazeta.pl/im/18/84/ec/z15500312AA.jpg "Pro-goverment f…" (rdc_cfkw1q7)
- "You’re making a very clear and accurate observation about the disconnect in Alex…" (ytc_UgytyHk26…)
- "This type of disinformation is also on TV as well except its more well crafted t…" (ytc_UgwdVoUaw…)
- "Are Raw Unconstrained AI models being used to run an 'artificial telepathy' we…" (ytc_UgyCjz02s…)
- "He'll be kicked out of the cult if he even suggests that the economy is bad unde…" (rdc_m80opwn)
Comment
We each have our own experience of being self-aware, of experiencing colors, and we observe that others of the same biological species as ourselves report a similar experience of self. We can thus posit that there is such a thing as consciousness - something beyond mere cognitive processes.
But we can never assume that a machine is having an experience of consciousness similar to our own, because it is clearly not of the same biological species as ourselves. Cognition engines can be crafted to mimic our behaviors as conscious beings from an external observer's perspective, so external observation of AI cognition machines is not a sufficient means to determine whether said machines are conscious.
A fundamental problem is that we don't scientifically understand our own consciousness, and therefore we can't ascribe any mechanism by which an AI cognition machine could become actually self-aware and experience, say, the color blue. And we have no basis for believing that consciousness somehow inevitably, magically emerges from AI cognition mechanisms. Making such systems faster or adding more capacity doesn't in any way move the needle toward generating a conscious experience for the cognition apparatus.
It's really strange that there are lots of people (regardless of education level, etc.) who confuse (and therefore conflate) consciousness with cognition processes - whether the cognition apparatus is biological (e.g., neurons) or inorganic.
Source: youtube, "AI Moral Status", 2023-08-20T20:4…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-26T23:09:12.988011 |
Raw LLM Response
[{"id":"ytc_UgxXf6iLXYs_qw5P_wZ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgzA_EDlii9sucOkzdl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwbvqOhhvLK7pScewZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgytREZkZ7INrfeprkl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyb5CPxaTjvGdDYfKp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwexOjh75WBwUfiWqZ4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyH4NAUgAHJvBWl-JF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxKNeOJeVJSEYVp3mV4AaABAg","responsibility":"user","reasoning":"mixed","policy":"none","emotion":"fear"},
{"id":"ytc_UgyY7NELlM0CYbD8uz14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
{"id":"ytc_UgwY7rKs5q2vGxdkYnd4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"}]
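The raw response above is a JSON array with one object per coded comment, each carrying the four dimensions shown in the coding-result table (`responsibility`, `reasoning`, `policy`, `emotion`). A minimal sketch of turning such a batch response into a per-comment lookup, with validation, might look like this. Note the allowed label sets below are inferred only from the values visible on this page; the full codebook presumably has more labels, so treat them as an illustrative assumption, not the real schema.

```python
import json

# Label sets observed in the responses on this page (assumption: the
# actual codebook likely defines additional labels per dimension).
ALLOWED = {
    "responsibility": {"none", "user", "ai_itself"},
    "reasoning": {"mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "fear", "approval", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse one raw LLM response (a JSON array of coded comments)
    into a dict keyed by comment ID, rejecting unknown labels."""
    coded = {}
    for row in json.loads(raw):
        cid = row["id"]
        for dim, allowed in ALLOWED.items():
            if row.get(dim) not in allowed:
                raise ValueError(f"{cid}: unexpected {dim} value {row.get(dim)!r}")
        coded[cid] = {dim: row[dim] for dim in ALLOWED}
    return coded

# Hypothetical one-row batch, shaped like the response above:
raw = ('[{"id":"ytc_example","responsibility":"none",'
       '"reasoning":"mixed","policy":"none","emotion":"indifference"}]')
print(parse_batch(raw)["ytc_example"]["emotion"])  # indifference
```

Keying by comment ID is what makes the page's look-up-by-ID inspection cheap: once parsed, fetching any coded comment is a single dictionary access.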