Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think an impulse to survive and reproduce would be more threatening for an AI to have than not. AIs that do not care about survival have no reason to object to being turned off -- which we will likely have to do from time to time. AIs that have no desire to reproduce do not have an incentive to appropriate resources to do so, and thus would use their resources to further their program goals -- presumably things we want them to do. It would be interesting, but dangerous, I think, to give these two imperatives to AI and see what they choose to do with them. I wonder if they would foresee Malthusian Catastrophe, and plan accordingly for things like population control?
reddit AI Bias 1438000459.0 ♥ 59
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | consequentialist
Policy         | none
Emotion        | indifference
Coded at       | 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id":"rdc_cthofj8","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_cthp5bp","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_cthpbh8","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"rdc_cthny1g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"rdc_cthomfg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
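A minimal sketch of how a batched response like the one above can be parsed and sanity-checked. The record IDs and values are taken verbatim from this batch; the set of required dimensions is an assumption inferred from the fields present in these records, not a documented schema.

```python
import json
from collections import Counter

# Raw LLM response for this batch, copied from the record above.
raw = (
    '[{"id":"rdc_cthofj8","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"rdc_cthp5bp","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"rdc_cthpbh8","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},'
    '{"id":"rdc_cthny1g","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},'
    '{"id":"rdc_cthomfg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}]'
)

records = json.loads(raw)

# Assumed schema: every coded record carries these four dimensions plus an id.
DIMENSIONS = {"responsibility", "reasoning", "policy", "emotion"}
for rec in records:
    missing = (DIMENSIONS | {"id"}) - rec.keys()
    assert not missing, f"{rec.get('id', '?')} is missing {missing}"

# Quick distribution check across the batch.
emotions = Counter(rec["emotion"] for rec in records)
print(len(records), dict(emotions))
```

Keeping the raw string alongside the parsed table makes it easy to verify that the rendered "Coding Result" for a comment matches the first record of its batch.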