Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
Haha, it's all in good fun! Rest assured, our AI models are here to assist and s…
ytr_Ugy5qpYSm…
I wonder how the people who own a given AI, knowing that once it crosses a certa…
ytc_UgyIKjMw9…
Every time I see something that looks too perfect, I check it on Undetectable AI…
ytc_UgwM07dc6…
I don't believe she sent anything, its A.I Images, someone just took her face an…
ytr_UgyVzmPJ9…
If the advent of driverless trucks will convince our idiotic politicians to do s…
ytc_Ugifh-3uC…
Thanks for caring sir but unfortunately our government has other plans for AI th…
ytc_Ugwfg-a5C…
Some of the same arguments are being made about AI that they said about photogra…
ytr_Ugx4Hil2Q…
Ai "artists" arn't artists (i am an actual pixel artist and i'm glad the ai fuck…
ytc_UgxXq1VEE…
Comment
As a materialist too, I don't believe in consciousness, so I would say some part of my brain is firing neurons, and that firing of neurons makes me want something; consciousness, or the impression of it, is the best thing evolution found to get us through difficult times, along with the flexibility of our brain.
And that's my big counterpoint to what he is saying, because right now AI is:
input(lots of them) -> layer(F(in) = in(i) * weight(i)) -> layer() -> ... -> layer() -> out(lots of them)
so there isn't any self-reference/recursion or anything; it's just taking input, applying the function, and returning output. On top of that, it's not really learning, it's static (to train it you need to apply another simple function, but backward from out to in, and you have to know the expected out).
For me, the impression of "consciousness" and the ability to learn come directly from the fact that our brain constantly references itself in a cyclic manner, and that's not something AI has been conceived for, as doing that would take infinite time before reaching the out node; the same goes for the learning process (due to going in a loop). For us there is a dying-off of signals, and the output is kind of everywhere.
For us, neurons seem to reinforce their connections independently from each other as electrochemical reactions pass through a connection; in that sense, a current AI model looks like one BIG neuron.
So the "consciousness" or emotions really aren't there at all; it's just a completely static sequence of matrix-vector multiplications.
youtube
AI Governance
2025-08-15T14:3…
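The comment above describes a plain feed-forward pass: inputs multiplied by weights, layer by layer, with no recurrence, no self-reference, and nothing changing at inference time. A minimal NumPy sketch of that static computation (the layer sizes, weights, and `tanh` nonlinearity are illustrative, not from any real model):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative layer sizes: 4 inputs -> two hidden layers -> 2 outputs.
sizes = [4, 8, 8, 2]
# Fixed ("static") weight matrices: once training is done, nothing changes.
weights = [rng.standard_normal((m, n)) for m, n in zip(sizes, sizes[1:])]

def forward(x):
    """One strictly feed-forward pass: input -> layer -> ... -> out.

    Each layer is a matrix-vector multiplication plus a nonlinearity;
    there is no loop back to earlier layers and no state between calls.
    """
    for w in weights:
        x = np.tanh(x @ w)
    return x

x = rng.standard_normal(4)
out = forward(x)
print(out.shape)  # (2,)
```

Because the weights are frozen and there is no internal state, calling `forward` twice on the same input always returns the same output, which is exactly the "completely static" property the commenter points to.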
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | none |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:59.937377 |
Raw LLM Response
[
{"id":"ytc_Ugz7Jx29YCCFsAVZwwJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
{"id":"ytc_Ugy2aKAwFdBX03uF9TV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwfBV6Z20mhY4nbSLd4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgxmbAiK346ZYBoL4a14AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugx_mzztV4E3kPpdIIV4AaABAg","responsibility":"company","reasoning":"deontological","policy":"industry_self","emotion":"resignation"},
{"id":"ytc_UgxNA8KhDXojt6be5M54AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwO1BWuogsYBNLCTFV4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwvWKwjvVDyxuIY9Z94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgxZ3XK9McoJXMSpwud4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"},
{"id":"ytc_Ugwr8B-dzLtIgGy8i-14AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"liability","emotion":"fear"}
]
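The raw response above is a JSON array of per-comment codes, one record per comment ID. The "Look up by comment ID" feature presumably just indexes this array by `id`; a sketch of that lookup, using two records copied verbatim from the response above (the dict-indexing approach is an assumption about the tool, not its actual implementation):

```python
import json

# Two records copied verbatim from the raw LLM response above.
raw_response = '''[
 {"id":"ytc_Ugz7Jx29YCCFsAVZwwJ4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"mixed"},
 {"id":"ytc_UgxZ3XK9McoJXMSpwud4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"mixed"}
]'''

# Index the coded records by comment ID for O(1) lookup.
codes_by_id = {rec["id"]: rec for rec in json.loads(raw_response)}

rec = codes_by_id["ytc_UgxZ3XK9McoJXMSpwud4AaABAg"]
print(rec["responsibility"], rec["policy"])  # ai_itself none
```

The second record matches the "Coding Result" table shown above (responsibility `ai_itself`, reasoning `mixed`, policy `none`, emotion `mixed`), which suggests that table is rendered from exactly this kind of per-ID record.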