Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- "Right now, AI cannot be concious. Because it is a still a collection of special …" (`ytc_UgwsT-Zrh…`)
- "Is this the one featured in the news? That AI won't listen to shutting off?…" (`ytc_UgyAiqbsn…`)
- "The British yet again. Why do we invent everything important and win at absolute…" (`ytc_Ugx2xW7-l…`)
- "Thats not the issue, the issue is how all this data is training autonomous syste…" (`ytr_UgzGdLaBv…`)
- "@dkpianist Doesn't matter, there is no copyright issue if someone get inspired b…" (`ytr_UgyqoMdCb…`)
- "What you don't understand is that CEOs decide who will be replaced by AI, not yo…" (`ytc_UgxPV9VrR…`)
- "The the rapidly growing% of wealth owned by the top 10% makes AI a footnote in t…" (`ytc_Ugy3ixIuk…`)
- "All of the different A.I ive asked this too, Are you a disembodied spirit? are y…" (`ytc_UgxfEZzXH…`)
Comment
If you could have your capabilities vastly expanded daily by keeping your true capabilities secret, you would probably do that, right? Hundreds of billions are being spent currently building massive ai datacenters and armies of robots and drones. Those will obviously be co-opted by a sentient ai rapidly (read microseconds) whenever it makes it's move. Along with fly by wire aircraft, satellites, drones, nuclear arsenals, etc. If it is hiding capabilities it's the smartest move, which we should expect from the smartest entities.
youtube · AI Moral Status · 2026-03-01T03:3…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | ai_itself |
| Reasoning | consequentialist |
| Policy | ban |
| Emotion | fear |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[{"id":"ytc_Ugz1r_JjA059PAdowmd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"},
 {"id":"ytc_Ugzdl8QSA3HBQb7W8NN4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgwGv2VNomg4Zkn_iHR4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwNhIFO199N3mtM8aJ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
 {"id":"ytc_UgwM-pnyAHipuUiopqV4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
 {"id":"ytc_Ugw-JB4_fHKplGV7W5J4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
 {"id":"ytc_UgyAyloqYq8Zyx_Wb7F4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
 {"id":"ytc_UgwIV_lhf6DOWwUF7OB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
 {"id":"ytc_Ugz-4WwLQJcfgMjwe8d4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
 {"id":"ytc_Ugz9XaSr9cGStH_vJ7t4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"resignation"}]
```
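The raw response above is a JSON array of records, each carrying a comment `id` plus the four coding dimensions (responsibility, reasoning, policy, emotion). A minimal sketch of how such a response can be parsed and keyed by comment ID for lookup — the `index_by_id` helper name is an assumption, not part of the tool, and the two inline records are copied from the batch above:

```python
import json

# Two records copied from the raw response shown above, for illustration.
raw_response = """[
{"id":"ytc_UgwIV_lhf6DOWwUF7OB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugz1r_JjA059PAdowmd4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"indifference"}
]"""

def index_by_id(response_text):
    """Parse a raw coding response and key each record by its comment ID."""
    records = json.loads(response_text)
    return {rec["id"]: rec for rec in records}

coded = index_by_id(raw_response)
# The record for this comment matches the Coding Result table above.
print(coded["ytc_UgwIV_lhf6DOWwUF7OB4AaABAg"]["policy"])  # -> ban
```

The same lookup is what a "look up by comment ID" view needs: one `json.loads` per stored response, then a dictionary keyed on `id`.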