Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Very interesting conversation. I must confess while I'm largely ignorant, I don't understand the concern about whether LLMs are or might ever be conscious. As I understand it, we don't understand consciousness at all beyond "I think therefore I am". The reason I assume that other biological entities like Profs Green and Bostrom are conscious in roughly the same way as I am, is because assuming otherwise would seem to be extremely unhelpful. But anything that has consciousness in this way is biological.
Unless you're a panpsychist (I believe they're called?), why would anyone assume that an LLM is conscious, when nobody understands what consciousness is, except that it's a thing that I have and assume that other biological beings possess? I mean, I'm not saying it's wrong, of course, I'm saying it seems an equivalent to worrying that Sonic the Hedgehog is conscious, because I can interact with him, he can answer questions, he can 'suffer', and so on.
Source: youtube | Video: AI Moral Status | Posted: 2026-04-20T08:5… | ♥ 4
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_UgyZkkFRo-IaJNrWMz54AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxmCpOeke9qBaQBTZx4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgwTP8bt4zAOYr8xEBl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgxjE83ElfpiWF196HN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwBAZ5jF6ci-ATnmUF4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgwPLn5yg-c2i2OIn9h4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgxtkX1kuWaHUyKXrgl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgzPDFuyrW2HCOvYWvl4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugw1MbyL1mWKI_WaRYp4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"regulate","emotion":"outrage"},
{"id":"ytc_UgyAfCeA1Cp_R_Tgnox4AaABAg","responsibility":"none","reasoning":"mixed","policy":"unclear","emotion":"indifference"}]
```
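The raw response above is a JSON array with one code object per comment, keyed by the same comment IDs the inspector uses for lookup. A minimal sketch of how such a batch response might be parsed and indexed by ID (the `index_codes` helper is hypothetical; the field names `id`, `responsibility`, `reasoning`, `policy`, `emotion` match the response above):

```python
import json

def index_codes(raw_response: str) -> dict:
    """Parse a raw batch response (a JSON array of per-comment
    code objects) and index the records by comment ID."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# One record from the batch above, used as a worked example.
raw = '[{"id":"ytc_UgyZkkFRo-IaJNrWMz54AaABAg","responsibility":"none",' \
      '"reasoning":"mixed","policy":"unclear","emotion":"mixed"}]'

codes = index_codes(raw)
print(codes["ytc_UgyZkkFRo-IaJNrWMz54AaABAg"]["emotion"])  # mixed
```

Looking a record up this way is what makes a per-comment view like the coding-result table above possible: each dimension in the table is one field of the indexed object.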