Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The title of this YouTube video is "ChatGPT isn't Smart. It's Something Much Weirder". Based on this attached transcript, what's weird about it?

The video "ChatGPT isn't Smart. It's Something Much Weirder" argues that what makes ChatGPT weird isn't its intelligence or lack thereof; it's that it doesn't actually "think" like humans at all, even though it can sound like it does. Based on your transcript, here's what's meant by that:

1. It's not intelligent; it's a mirror of human language. The video explains that ChatGPT doesn't reason, understand, or form beliefs. Instead, it predicts the next word in a sentence based on patterns it has seen in billions of examples of human writing. So it's performing intelligence, not possessing it. That's the weird part: it gives the appearance of thought without any actual thinking going on.

2. It captures our collective patterns of thought. ChatGPT's "mind" isn't individual; it's a statistical reflection of humanity's collective language use. It doesn't know anything, but it has absorbed how we express knowing. It's not one mind; it's a blend of millions of human minds, compressed into a probabilistic model. That's why it can sound profound or creative: it's remixing cultural and intellectual echoes rather than producing original understanding.

3. It's alien, but familiar. The video describes GPT as something deeply alien, even though it imitates us perfectly. It doesn't think, feel, or intend, but it still produces meaning in a humanly recognizable way. This gives it an uncanny quality: it feels human, but it's not. The weirdness lies in this contradiction: it talks like us, but there's nobody home.

4. It's more like a linguistic organism than a brain. The narrator compares it to an evolutionary process: a system that has learned how humans sound rather than how humans think. Like evolution, it doesn't plan or understand; it just optimizes outputs over countless iterations. So in a sense, ChatGPT is weirder than a machine: it's a statistical ghost of human thought.

5. It challenges what "understanding" even means. The final point is philosophical: if ChatGPT can imitate understanding so well that people feel understood, what does that say about us? Maybe human communication itself is more about pattern and imitation than deep comprehension, and GPT simply exposes that.
Source: YouTube · AI Moral Status · 2025-11-01T20:4…
Coding Result
Dimension       Value
Responsibility  unclear
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgwTK6k8Aqw9vNPIK-94AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_Ugwei_7KP3azDFb_-Pp4AaABAg", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgyjvbECDnG4bkxbxWB4AaABAg", "responsibility": "user", "reasoning": "virtue", "policy": "industry_self", "emotion": "resignation"},
  {"id": "ytc_UgxQrs3xC8lMDghTtEV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_UgzVkOt8_Xb97UiZNcJ4AaABAg", "responsibility": "developer", "reasoning": "virtue", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgzUgLam1hNwDO55mjN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxTrEIy5Yb9WlaNc6t4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxVPdJuAHQIJOjuimN4AaABAg", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytc_Ugych_K1BB1AgP2OzlV4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
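The raw response is a JSON array with one record per comment, each carrying the four coding dimensions (responsibility, reasoning, policy, emotion) plus the comment id. A minimal Python sketch of how such a response could be parsed and sanity-checked before use; the function name and the required-field check are illustrative additions, not part of the original coding pipeline:

```python
import json

# A two-record excerpt of the raw LLM response shown above.
raw = '''[
 {"id":"ytc_UgwDx3DQjiqU2qJG6FZ4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
 {"id":"ytc_UgwTK6k8Aqw9vNPIK-94AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]'''

# Fields every coded record is expected to carry (inferred from the data above).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codings(text):
    """Parse the LLM's JSON array and verify each record has all coding dimensions."""
    records = json.loads(text)
    for rec in records:
        missing = REQUIRED_FIELDS - rec.keys()
        if missing:
            raise ValueError(f"record {rec.get('id')} is missing fields: {missing}")
    return records

codings = parse_codings(raw)
print(len(codings))           # 2
print(codings[0]["emotion"])  # fear
```

Validating the field set up front makes a malformed or truncated model response fail loudly at parse time rather than surfacing later as "unclear" values in the coding table.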