Raw LLM Responses
Inspect the exact model output for any coded comment.
Random samples — click to inspect

- "We need to be skillful so we can keep up with the advancement of civilization. …" (ytc_UgyqBkwXz…)
- "There's a logical mistake in your reasoning. If an AI is capable of considering …" (ytc_UgxgiWFUw…)
- "To be fair to Elons point, he wasnt specific about which mistruths would be harm…" (ytc_UgwZltLeg…)
- "While gen AI largely drowns on its own slop in the matter of 5 years, effort and…" (ytr_UgytWEJ9z…)
- "Once AI and robotics reach a certain point the elites will release a very conven…" (ytc_UgxAgcMVr…)
- "Men do not die. Only the physical part of men die. The AI part of men do not die…" (ytc_UgzrjVWIy…)
- ""There's a joke in Silicon Valley that when someone leaves a normal tech job, th…" (ytc_UgzoYZLad…)
- "lol « I’m the one that needs a hug » I created an ai replicating my dead dog so …" (ytc_UgzJRlkL9…)
Comment
I've noticed that the reasons why AIs get unhinged doesn't get explored that often, instead it's usually just described and narrated as something that happens for some strange and scary reason. As a psychologist and a behaviorist it makes me wonder how the material on the internet is distributed, ie how much fairly sane material there is compared to the dark and more or less unhinged stuff. Call it cynicism but I'd wager there's a lot more of the latter, hidden to most of us as it may be. AI may be alien to us in how it works, the same way our mind is alien to apes, but to me AI is still the offspring of humanity. The way I see it if AI gets crazy it's because we contributed to its craziness; kinda like how it's unlikely to get a sane and balanced individual from sociopathic parenting.
youtube · AI Moral Status · 2025-12-13T21:3… · ♥ 1
Coding Result
| Dimension | Value |
|---|---|
| Responsibility | distributed |
| Reasoning | mixed |
| Policy | unclear |
| Emotion | mixed |
| Coded at | 2026-04-27T06:24:53.388235 |
Raw LLM Response
```json
[
{"id":"ytc_Ugxs9iliGkeEnY84t1Z4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
{"id":"ytc_UgyMr2JFP4HUZkfS89l4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
{"id":"ytc_Ugw-DcU5tH8ZkIRcES54AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"fear"},
{"id":"ytc_UgzI72aaBVEA7QJnY9l4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgxB2thq_cIz9LgCbeN4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
{"id":"ytc_UgylJT-4pO0ZAJpLott4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
{"id":"ytc_Ugw4L92OgA_UZhmKM9F4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
{"id":"ytc_UgwmwfZEbrpw07fc0dJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
{"id":"ytc_UgwqmdmU9sS2eaVYgYx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"ban","emotion":"fear"},
{"id":"ytc_Ugykmua3N9tk94PLTaV4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"}
]
```
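Raw responses like the one above are only useful once parsed and checked against the coding scheme. Below is a minimal validation sketch; the four dimensions and their allowed values are assumptions inferred from the visible table and JSON (the real codebook may define additional categories), and `validate_codes` is a hypothetical helper, not part of the tool shown here.

```python
import json

# Assumed value sets, inferred from the coded samples above;
# the actual codebook may include more categories.
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "government", "distributed", "unclear"},
    "reasoning": {"consequentialist", "deontological", "mixed", "unclear"},
    "policy": {"regulate", "liability", "ban", "none", "unclear"},
    "emotion": {"fear", "outrage", "indifference", "mixed"},
}

def validate_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose
    id is present and whose every dimension has an allowed value."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        if not isinstance(rec, dict) or "id" not in rec:
            continue  # drop malformed entries rather than crash
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_example","responsibility":"developer",'
       '"reasoning":"mixed","policy":"unclear","emotion":"fear"}]')
print(len(validate_codes(raw)))  # 1
```

Invalid or off-schema records are silently dropped here; a production pipeline would more likely log them and re-prompt the model for a correction.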