Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Thinking of "the way humans think" as properly defined/convened, is itself a collective "hallucination". Talk of expectations of doom among AI researchers needs to mention how that community got prompted to it decades in advance by the likes of Yudkowsky. The whole discussion of intelligence is warped by a systemic bias of who naturally dominates the discussion: the successful smart guy reasoning from his own case.
youtube AI Moral Status 2026-01-12T09:5…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWGeKSHMWomMqtGdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzrby9lpXaCM_guGOd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwgM6oLujUj0bb1OnJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyhqhqSyFrstcZ5bE14AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"unclear"},
  {"id":"ytc_UgyCcKFg-pYWMVS9iqt4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgzBaA2uob33O8PnpGt4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugyqa1nK8PyY0PYgEUJ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxxwCYBwyFrCdMftc14AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgwnhkSmS9Nmt6i6LtB4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
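A raw response like the one above can be parsed and tallied per coding dimension. This is a minimal sketch in Python, assuming the field names shown in the JSON (`responsibility`, `reasoning`, `policy`, `emotion`); the excerpt below uses only the first two objects from the array, and the variable names are illustrative, not part of any actual pipeline.

```python
import json
from collections import Counter

# Raw LLM response: a JSON array of per-comment codings.
# Field names match the raw response above; this is a two-object excerpt.
raw = '''[
  {"id":"ytc_Ugwz5AH-bpg-7GSmMTh4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxWGeKSHMWomMqtGdN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

codings = json.loads(raw)

# Count how often each value appears in each coding dimension.
dimensions = ("responsibility", "reasoning", "policy", "emotion")
tallies = {dim: Counter(c[dim] for c in codings) for dim in dimensions}

for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Looking up a single comment's coding is then a matter of matching on the `id` field, e.g. `next(c for c in codings if c["id"] == some_id)`.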