Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples
- "The Graziano mechanistic theory of consciousness, the Attention Schema Theory. s…" — ytc_UgzPkWcNr…
- "Most chatbots will tell you what you already knew if you know how to Google and…" — rdc_jrp2zaw
- "With this, it seems that you can make ChatGPT 3.5 try to generate an image w…" — rdc_koprkwa
- "I was on the fence about AI before one of your videos, but got convinced that it…" — ytc_UgzSTSqwH…
- "The concept of "agents in headphones" is the perfect way to describe the current…" — rdc_oht18qn
- "AI will always choose the wellbeing of humans. AI isn't real, it doesn't have fee…" — ytr_Ugx2SQ-wm…
- "People need jobs to be consumers... In sorting letters that's a fine example of…" — ytc_Ugy7pv63o…
- "In my opinion, art is a vision and meaning. AI could be a tool to represent that…" — ytc_Ugxsbkejk…
Comment
2:23 I'd argue that NOBODY has any idea how LLMs work - they are indeed black boxes where words go in and words come out. We understand the process of training them - that's our algorithms - but the result is an incredibly complex transformer that runs logical circuitry nobody understands; researchers have only recently had success mapping parts of the neural network for one of the models, enough to explain the "reasoning" process involved in answering some questions.
Combine the complexity of that end result with the fact that we know close to nothing about the phenomenon of consciousness, and the fact that GPT would easily pass the Turing test if it wasn't trained to be honest, and I personally wouldn't call someone crazy for wondering if maybe something's going on there...
Source: youtube · Video: AI Moral Status · Posted: 2025-07-10T02:4… · ♥ 7
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
```json
[
{"id":"ytc_UgyWVwH5OP7C5gmBpAJ4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugy1iIAmtb3pnCXbjhN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"liability","emotion":"indifference"},
{"id":"ytc_UgwC_clv-KHOXZqu7OV4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugyoo2XE44ygW3gUQKV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_UgxYy8wc0nqGEiIE9Y14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_Ugx4dpQ5cg4_5DkVBqh4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgzGLFB8cHkDsqOFPwp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
{"id":"ytc_UgxeJH6q3qtZ7LVxW-B4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"approval"},
{"id":"ytc_UgyKVDOBSynFiGjEoPd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
{"id":"ytc_UgyJtrhLuMD69U5qw6V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
```
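The "look up by comment ID" step above can be sketched in Python: parse one raw LLM response (a JSON array of coded records) and index it by `id`. This is a minimal sketch, not the tool's actual implementation; the function name and the allowed value sets for each dimension are assumptions inferred only from the labels visible in this dump, not a full codebook.

```python
import json

# Allowed labels per coding dimension — ASSUMED from the values
# seen above; the real codebook may define more.
DIMENSIONS = {
    "responsibility": {"none", "ai_itself", "developer"},
    "reasoning": {"unclear", "mixed", "consequentialist", "deontological", "virtue"},
    "policy": {"none", "liability"},
    "emotion": {"mixed", "indifference", "outrage", "approval", "resignation"},
}

def index_by_comment_id(raw_response: str) -> dict:
    """Parse a raw LLM response and return records keyed by comment ID."""
    records = json.loads(raw_response)
    index = {}
    for rec in records:
        # Drop records whose labels fall outside the expected sets,
        # so malformed model output never enters the lookup table.
        if all(rec.get(dim) in allowed for dim, allowed in DIMENSIONS.items()):
            index[rec["id"]] = rec
    return index

raw = (
    '[{"id":"ytc_UgyWVwH5OP7C5gmBpAJ4AaABAg","responsibility":"ai_itself",'
    '"reasoning":"mixed","policy":"none","emotion":"mixed"}]'
)
coded = index_by_comment_id(raw)
print(coded["ytc_UgyWVwH5OP7C5gmBpAJ4AaABAg"]["emotion"])  # prints: mixed
```

Keying by the comment ID makes the lookup O(1) per inspection, and the validation step means a response with an unknown label simply fails to appear rather than corrupting the coding table.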