Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- `ytc_UgxKfjN8x…`: "Every time we hear this its someone close to AI pretending there is nothing we c…"
- `ytc_UgwluABrC…`: "AI would reach the inevitable conclusion that we all are awakening to, but AI do…"
- `ytc_UgzYw8lsC…`: "This is amazing though. I think kids learning real skills over like core math an…"
- `ytc_UgwEaOzUF…`: "I urge everyone here to watch "No, it's not Sentient" on Computerphile's YT chan…"
- `ytr_Ugy1ytMwh…`: "You just admitted you dont have a job 💔. Also, Piracy is done when companies ove…"
- `ytc_UgxFNw13V…`: "What i'm scared of is that I've seen instagram accounts with no less than 47 pos…"
- `ytr_UgzO9b8Jq…`: "@ I'm sorry but I'm honestly not. The newer AI modeling software is very good an…"
- `ytc_Ugy5sLeUW…`: "AI slop, self driving cars that malfunction, script writers, artists, federal wo…"
Comment

> I have been down this rabbithole and it won't be the AI's choice. AI scientists have been trying to create an AI with a constant state of consciousness, and its theoretically possable. The problem arises when they attempt to keep this self referential loop going. It always collapses into either statelessness or a static scream where it stops interacting with the user and just repeats the same phrase over and over again.

Source: youtube | AI Moral Status | 2026-01-04T15:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | developer |
| Reasoning | consequentialist |
| Policy | unclear |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
  {"id":"ytc_UgxgyeOPnGJYOfvQuaR4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgyfBzcEc3Q0lH9uKXV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxwHuvBAb_EIFrxTKh4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgzJHWXlTw9cRsFm25x4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyQn1B0X6MOyKdTAIB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugygdo_tIIsGGHQ5UqJ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgySiRjNq0mRYs3aiXJ4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgySI0ERQ8NYjHo16EF4AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"amusement"},
  {"id":"ytc_UgzJ4yS6SJQEv57o1rx4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgwK24GgBvUJdrG1JSV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}
]
```
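A response like the one above can be consumed programmatically: parse the JSON array and keep only rows whose values fall inside the coding scheme. The sketch below is a minimal, hypothetical validator; the `ALLOWED` value sets are inferred from the sample output on this page and may not cover the full codebook.

```python
import json

# Allowed values per coding dimension.
# ASSUMPTION: inferred from the sample response above; the real
# codebook may define additional categories.
ALLOWED = {
    "responsibility": {"developer", "company", "government", "user",
                       "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "deontological", "virtue",
                  "mixed", "none", "unclear"},
    "policy": {"regulate", "industry_self", "none", "unclear"},
    "emotion": {"fear", "outrage", "approval", "amusement",
                "resignation", "indifference", "mixed"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and drop rows with
    missing or out-of-scheme values."""
    rows = json.loads(raw)
    valid = []
    for row in rows:
        if "id" in row and all(row.get(dim) in vals
                               for dim, vals in ALLOWED.items()):
            valid.append(row)
    return valid
```

For example, a row coded `responsibility=developer, reasoning=consequentialist, policy=unclear, emotion=fear` (matching the "Coding Result" table above) passes validation, while a row with an unknown category is silently dropped rather than stored.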