Raw LLM Responses
Inspect the exact model output for any coded comment.
Comment
Automatons since 300 BC have fooled people they were alive. LLMs are static and lack an interconnected continual flow of feedback process to continually modify weights and biases which is a minimum requirement of simulating experience. Anesthesia turns off our brain's ability to communicate to itself and adjust our connection structures which is why we do not have experience during anesthesia, essentially rendering our brains static to a degree temporarily. Static structures cannot experience anything. One could argue in the training phase, it would be more likely for this to be possible but the LLM itself is the product of that training as a static structure.
Source: youtube · Video: AI Moral Status · Posted: 2024-07-26T13:2…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | deontological |
| Policy | unclear |
| Emotion | indifference |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[{"id":"ytc_Ugwi491arsCI9TAP2Sx4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugwee--Qrwc3gaGLV8B4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzMcALFCqCde21OPFd4AaABAg","responsibility":"user","reasoning":"mixed","policy":"unclear","emotion":"approval"},
{"id":"ytc_UgzRDZJ9NMqon1lp2eN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzHx9RsfLDWgkwxAgd4AaABAg","responsibility":"unclear","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyV-EGk8aGNHXshe7V4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwqN_k4RdX6YiPUPj94AaABAg","responsibility":"company","reasoning":"contractualist","policy":"liability","emotion":"fear"},
{"id":"ytc_UgxMx5oDEAkAL1d14t94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
{"id":"ytc_Ugx1soS7e1REwbVtgRB4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgwtFxwkIr908R8Hftl4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"outrage"}]
```