Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment

> 1:26:48 - I agree with everything there, except "..the chat bot just HAD a subjective experience". It didn't! Not in the any way comparable to humans. It didn't have ANY experience at all. It's an LLM with inputs and outputs. It used language that was accurate based on it's model, the tokenisation of inputs and the statistical result of it's training, to deduce what was actually happening and to relay that. I will never understand why Geoffrey Hinton, a very clever man, is barking up this tree of LLM's having actual experience of any kind. It's purely a 100% deterministic computer model doing exactly what it was designed to do. It's simulating based on training data. Stop this nonsense, please. He might as well be saying computers have always had subjective experience.

Source: youtube · AI Moral Status · 2026-03-15T12:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | consequentialist |
| Policy | none |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_UgzgXJ633_EPqpawrBd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_UgwweU-x-UeiBnQ_jzJ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgxNVS5RMabpa9iYx114AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_Ugw_m1v4qo560OhY6Tt4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"fear"},
{"id":"ytc_UgwxGg_rVYc-kYam_m94AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwExdPc74YAgEcskjp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugzbht3Id1G341J6DrN4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugy2xNBYxfSRS_caGcF4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugz2IxaVhxMekEkrVB14AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgyXgYeaZ-fpaeuy60t4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
```
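A response like the one above is only usable downstream if every record carries the four coded dimensions with values from the codebook. The following is a minimal validation sketch; the dimension names come from this page, but the allowed value sets are inferred from the codes visible here and the real codebook may define more (an assumption).

```python
import json

# A single record copied from the raw LLM response above.
raw = ('[{"id":"ytc_UgzgXJ633_EPqpawrBd4AaABAg","responsibility":"none",'
      '"reasoning":"consequentialist","policy":"none","emotion":"indifference"}]')

# Allowed values inferred from the codes visible on this page (assumption:
# the actual codebook may include values not seen in this sample).
ALLOWED = {
    "responsibility": {"none", "developer", "user", "ai_itself"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "ban", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "approval", "outrage", "fear", "mixed"},
}

def validate(records):
    """Return (comment_id, dimension, bad_value) for every out-of-codebook code."""
    problems = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                problems.append((rec.get("id"), dim, rec.get(dim)))
    return problems

records = json.loads(raw)
print(validate(records))  # an empty list when every code is in the allowed set
```

Running a check like this before storing codes catches both malformed JSON (the `json.loads` call raises) and hallucinated category labels, which are the two most common LLM-coding failure modes.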