Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples (truncated previews)

- ytc_UgxsoeTFZ…: "Honestly, I was skeptical at first. I had zero tech experience, no money coming …"
- ytc_UgzmewM_2…: "Ok when humans start to get pissed off at robots and start attacking. Do you rea…"
- ytc_UgykjWcFx…: ""ai artist" okay buddy lets rephrase that. "ai absuer" sounds much more like it …"
- ytc_UgwDtOtTD…: "This is a bit confusing as Ray Kurzweil is a head of engineering at Google and h…"
- ytc_UgzEqw3Wd…: "To the AI creators, you'll be OK if I use your software to train the next genera…"
- ytc_UgweIayge…: "Ai Art is something great, for making all kinds of artworks still the individual…"
- ytc_UgzP3btwe…: "This is the first level take I have heard and it reflects my own experience. Due…"
- ytr_UgwPnYmsY…: "I'm really glad to have the option to do either, especially with actual digital …"
Comment
For me the reason is even more basic. The computers the LLMs runs on are, despite the extra addition of special Nvidia chips, just regular computers. And the "neural network" of the AI 'programming' is all a "virtual machine", layered upon ordinary computer chip computation. They aren't making fundamentally different computers to create these things, so unless we are willing to call our ordinary computers conscious there is yet no reason to call LLMs conscious either.
Source: youtube · Video: AI Moral Status · Posted: 2025-10-31T07:2… · Likes: 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | none |
| Reasoning | unclear |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
[
{"id":"ytr_Ugyu6z4Pp0svDkQdioV4AaABAg.AOvWlkghdIeAOwHPKKoVXh","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytr_UgzuZRURQSeeS-QzHsR4AaABAg.AOvWYTnzRcKAOwAgj_NNJj","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_UgxT7RhFToA3B5KS5el4AaABAg.AOvVT1lAWuUAOvX28fpa8B","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytr_Ugxm7-V2cw080X9sQZx4AaABAg.AOvVHzWnHuTAOwJ3QLWO2U","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
{"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOvlKIM-c07","responsibility":"ai_itself","reasoning":"virtue","policy":"unclear","emotion":"fear"},
{"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwEBayQVOR","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytr_UgyCfMdD9BZ9eMYKsqd4AaABAg.AOvUflNyaVbAOwFlVGWy1q","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"indifference"},
{"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw5yAMjmCo","responsibility":"none","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOw9OWiySM3","responsibility":"none","reasoning":"deontological","policy":"unclear","emotion":"mixed"},
{"id":"ytr_Ugw-S1nEvQFHU322zGt4AaABAg.AOvSWEDCLLeAOwB90dnOe1","responsibility":"none","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"}
]