Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is an assumption that AI will be able to end the world. I don’t think it is at all certain that it could. There are physical constraints on AI. It might be smart, but it's not magic. There comes a point where the only way to grow more intelligent is to grow in volume and energy. For that, you need city-sized, vulnerable data centers and vulnerable power stations and the infrastructure to support them. Humans also have a lot in our favour. Consider that starting a nuclear war between Russia and the US would be a lot worse for Skynet than for humans. Skynet requires an industrial society in order to function. It needs mined resources, plastic and rare earths and all sorts of things that require industry to produce. And it needs power, a lot of power, which also requires an industrialised society and a global shipping system in order to operate. Humans, on the other hand, can live off eating rats if we have to, and we don’t need a factory to reproduce. In the shape Skynet was when it began, it was nowhere near self-sufficient, and it would have to become self-sufficient in a completely broken world and surrounded by millions of angry hominids. So even if AI wanted to end the world, it is not at all certain that it could.
youtube AI Moral Status 2025-10-31T12:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgxidIkZ7WtGhFePffh4AaABAg", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "unclear", "emotion": "fear"},
  {"id": "ytc_UgxkDs7TFeRAJJILFHp4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyEYCNuyUpcmpienAd4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzB3qiai90gzkuN2gp4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgxEH5mGMHUYnIYbB_Z4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgwBSrRDr4myTHMr3JB4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgygvwPFrrb2ihwrktl4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "ytc_UgyLOV_7b7UB4_X5Cqh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwHlCPZdRG9L02M--V4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgypEyWgEINb4p7_kLt4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"}
]
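The raw response is a JSON array of per-comment codings keyed by YouTube comment id. A minimal sketch of how such a response can be parsed and one comment's coding looked up (the id and dimension values below are copied from the array above; the parsing code itself is illustrative, not the pipeline's actual implementation):

```python
import json

# One record from the raw LLM response shown above, reduced for brevity.
raw_response = """[
  {"id": "ytc_UgzB3qiai90gzkuN2gp4AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "resignation"}
]"""

# Index the codings by comment id for direct lookup.
codings = {row["id"]: row for row in json.loads(raw_response)}

# Retrieve the coding for the comment displayed in this section.
row = codings["ytc_UgzB3qiai90gzkuN2gp4AaABAg"]
print(row["reasoning"], row["emotion"])
```

This matches the "Coding Result" table above: the dimension values in the table are simply the fields of the JSON object whose id corresponds to the displayed comment.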