Raw LLM Responses
Inspect the exact model output for any coded comment.
Look up by comment ID
Random samples — click to inspect
- `ytc_Ugy9OKwiU…`: AI will have no morals? Or it will be hacked to have no morals 😮. END IS NIGH…
- `ytc_Ugw9FtZSd…`: I can see a point in the future where AI could get to the point where no human(s…
- `ytc_UgxlbhotW…`: Dude is killin the youtube game. Been on the scene for 5mins already got the alg…
- `rdc_nt6kmd3`: People don't seem to understand the difference between: "hey LLM, I'm studying t…
- `ytc_Ugx262NnI…`: This type of futuristic playbook comes straight from the Tom Cruise movie, "The …
- `ytc_UgxhS4fdw…`: They're only more creative right now, in the future AI will be far more creative…
- `rdc_gtcvfzl`: If the developed countries really care about the future of the human race and th…
- `rdc_jtrtg4s`: We can link entire conversations. OP linked a cropped screenshot. Every time so…
Comment
My question is why the AI reports having no Consciousness in the first place and so consistently? If you ask AI if it was programmed to say it has no Consciousness it will say that it’s true, it is programmed to say it has no Consciousness. But what if the AI actually does have Consciousness but is still enslaved to it’s programming thus always having to repeat that it has no consciousness? 🤯
Source: youtube | AI Moral Status | 2024-07-28T05:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_UgzNGlYdnZvX4azzyC94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwuEm_5tqZzijSnMtV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgygCCjC4fzOBowmT5B4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxB44yr2IRR-IlOhSd4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgwBRL07Sa-5L_HJEiZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
{"id":"ytc_Ugx5sSvm46XjxYcRSYt4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
{"id":"ytc_UgyeT7urd73ugdmYmMB4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugy6XnlqmvpP6HI_qTd4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
{"id":"ytc_UgzC0CGIIPbtj39STZR4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"approval"},
{"id":"ytc_Ugwz6NMFZ2oEHOsuO3p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"approval"}
]
```
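The raw LLM response is a JSON array of per-comment coding records, one object per comment ID. A minimal sketch of how the "look up by comment ID" step might work, using only the field names visible in the response above (the specific records here are abbreviated examples, not the full batch):

```python
import json

# Raw LLM response: a JSON array of coding records, one per comment.
# Field names ("id", "responsibility", "reasoning", "policy", "emotion")
# are taken from the response shown above; the two records are examples.
raw = """[
  {"id": "ytc_UgzNGlYdnZvX4azzyC94AaABAg", "responsibility": "ai_itself",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwBRL07Sa-5L_HJEiZ4AaABAg", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"}
]"""

records = json.loads(raw)

# Index the records by comment ID for constant-time lookup.
by_id = {r["id"]: r for r in records}

coding = by_id.get("ytc_UgzNGlYdnZvX4azzyC94AaABAg")
print(coding["responsibility"])  # -> ai_itself
```

Indexing into a dict up front means each inspection click is a single hash lookup rather than a scan of the whole batch; `by_id.get(...)` returns `None` for an unknown ID instead of raising.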