Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The market will not take care of it, but it's really a question of how quickly things are automated, and how easy it is to scale the automation. Advances in robotics are the real problem. We're seeing in fields like translation and CGI modelling things that used to require a team of 10 now can be done by a single person. (+ software that costs less than a second person to run.) If it's actually possible to automate 99% of jobs, and it's also possible to provide those services for 99% less cost, then individual municipalities will be able to build 100x as much subsidized housing for example. You don't need buy-in from everyone if costs actually come down that much, you just need a critical mass of organizations (not just local/state/national governments but also NGOs) that are committed to providing high-quality social services for free. This is why people are trying to accelerate the AI timeline, because if advancements happen fast enough, like, say California spends something like $3 billion/year on public housing. If that's currently providing housing for 30,000 people, that's not enough obviously but say that we could provide housing for 3,000,000 people with that same budget, it starts to sound more realistic to actually end homelessness.
Source: reddit · Topic: AI Moral Status · Timestamp: 1709529949.0 · ♥ 1
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_kt6cu9m","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"rdc_kt9bijh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_kt61sau","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"rdc_kt63x5j","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_kt5spsk","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
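Since the raw response is a JSON array covering a whole batch of comments, recovering one comment's coding means parsing the array and selecting the entry whose `id` matches. A minimal sketch in Python, assuming the id `rdc_kt9bijh` is the one for the comment above (two batch entries share the displayed values, so the mapping shown here is an illustration, not confirmed by the source):

```python
import json

# A shortened copy of the raw LLM batch response shown above.
RAW_RESPONSE = """[
  {"id":"rdc_kt6cu9m","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"rdc_kt9bijh","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]"""

def coding_for(raw_response: str, comment_id: str) -> dict:
    """Parse the raw batch response and return the coding for one comment id."""
    entries = json.loads(raw_response)
    by_id = {entry["id"]: entry for entry in entries}
    return by_id[comment_id]

# Assumed id for the comment displayed on this page.
result = coding_for(RAW_RESPONSE, "rdc_kt9bijh")
print(result["reasoning"])  # → consequentialist
```

If the model ever returns malformed JSON, `json.loads` raises `json.JSONDecodeError`, which is the natural place to flag a response for manual inspection.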