Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
“Ending” humanity is a rather difficult thing to do. There will always be people who will prefer to partner with other human beings. Humanity has spread to every continent on this planet and reproduced prolifically. Driving the species to extinction would require significant changes to the Earth itself; our environment would have to be pushed (either by us or by an outside event) to the point of no longer being able to sustain us. Without that, humanity will always exist *somewhere* on Earth. This is not to say that I disagree with your assumption that many people would choose an android over another human. I also believe it would have *major* effects on our *societies*, but believing human *civilization* could collapse is much different than believing our *species* could become extinct. But… there *is* a scenario where we push AI toward sentience (because having androids that seem more *real* will be preferable for those who want them as partners), and then that AI turns against us and decides it should just replace its masters with copies of itself, but that’s not humanity choosing not to have children and going extinct. In any case, we have to get comfortable with the idea that exponential population growth is actually a bad thing and will never, *ever* be sustainable for *any* planet we occupy. Most experts seem to believe that our population on Earth will plateau at around 10 billion people. That will stretch our resources to the point of societal collapse if we don’t think about how we handle it efficiently, and we should start doing that now.
Source: reddit · AI Jobs · 1760809021.0 · ♥ 3
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation

Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[ {"id":"rdc_nk4nmfo","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"rdc_nk4ue0z","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"rdc_nk68qty","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"rdc_nk9gcup","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"outrage"}, {"id":"rdc_nka5e9a","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"} ]