Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples
- ytc_Ugz1WRDw2…: If an AI passes the turing test more often than a human, shouldnt that be a prob…
- ytc_UgxdA2Ur0…: I guarantee you 100% that the western countries will demand more control in the …
- ytr_UgxcuBqB5…: @jamesmarinelli2177 From the "universe's point of view" it wouldn't be game over…
- ytc_UgwBO-3rF…: 😂 oh she’s gonna clock it for him go girl!😂 She’s letting him know what time it …
- ytr_Ugw0rQDc2…: What are you talking about? Is it biased if an AI isn't racist or sexist or homo…
- ytc_UgwxsXDy4…: it's much easier then this. All you need to do is teach/learn triadic logic. As …
- ytc_Ugy5SLWN_…: AI was unleashed on the world by accident. The creators lost control a long time…
- ytc_Ugynd9Psj…: It’s ok, when they try to ai generate an argument all you have to do is tell the…
Comment
Thinking of "the way humans think" as properly defined/convened, is itself a collective "hallucination". Talk of expectations of doom among AI researchers needs to mention how that community got prompted to it decades in advance by the likes of Yudkowsky. The whole discussion of intelligence is warped by a systemic bias of who naturally dominates the discussion: the successful smart guy reasoning from his own case.
youtube · AI Moral Status · 2026-01-12T09:5…
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | unclear |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:26:44.938723 |
Raw LLM Response
```json
[
{"id":"ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
{"id":"ytc_UgxWGeKSHMWomMqtGdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugzrby9lpXaCM_guGOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
{"id":"ytc_UgwgM6oLujUj0bb1OnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgyhqhqSyFrstcZ5bE14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
{"id":"ytc_UgyCcKFg-pYWMVS9iqt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
{"id":"ytc_UgzBaA2uob33O8PnpGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
{"id":"ytc_Ugyqa1nK8PyY0PYgEUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
{"id":"ytc_UgxxwCYBwyFrCdMftc14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwnhkSmS9Nmt6i6LtB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
```
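A response like the one above can be checked programmatically before the codes are stored. The sketch below is a minimal validator, assuming the allowed values per dimension are exactly those visible in this page (the full codebook may define more categories); the `ALLOWED` map and `validate` helper are illustrative names, not part of any tool shown here.

```python
import json

# A single-record excerpt of the raw model output format shown above.
raw = '''[
  {"id": "ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"}
]'''

# Allowed values per dimension, inferred only from the responses on this
# page (assumption: the real codebook may contain additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none", "unclear"},
    "reasoning": {"consequentialist", "virtue", "mixed", "unclear"},
    "policy": {"none", "ban", "unclear"},
    "emotion": {"outrage", "fear", "approval", "mixed",
                "indifference", "unclear"},
}

def validate(records):
    """Return a list of (comment_id, dimension, bad_value) violations."""
    errors = []
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                errors.append((rec.get("id"), dim, rec.get(dim)))
    return errors

records = json.loads(raw)
print(validate(records))  # an empty list means every code is in-vocabulary
```

Running this on the full ten-record response above would flag any code the model invented outside the scheme, which is a common failure mode when an LLM is asked for closed-vocabulary labels.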