Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
/u/ComplexExponential asked:

> 4. And continuing on my above point, if will is seen as a programmable phenomenon, then can a genetically engineered being without a will be considered exempt from all ethical considerations? And if not, then doesn't AI deserve equal ethical consideration?

Something that doesn't have a will is probably not a moral agent. If so, we couldn't hold it responsible. However, it may still be an 'innocent' threat, like a rock on top of a building that could fall and hit someone on the head. To that extent, we could take measures to protect ourselves against it.
reddit · AI Moral Status · 1487175073.0 · ♥ 2
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        deontological
Policy           unclear
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_dds1kol", "responsibility": "none",      "reasoning": "consequentialist", "policy": "unclear",   "emotion": "indifference"},
  {"id": "rdc_dds1ott", "responsibility": "unclear",   "reasoning": "deontological",    "policy": "unclear",   "emotion": "mixed"},
  {"id": "rdc_dds7hhz", "responsibility": "none",      "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "rdc_dds1oe9", "responsibility": "ai_itself", "reasoning": "deontological",    "policy": "unclear",   "emotion": "indifference"},
  {"id": "rdc_dds3r39", "responsibility": "user",      "reasoning": "mixed",            "policy": "liability", "emotion": "fear"}
]
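The raw response is a JSON array with one coding record per comment. A minimal sketch of how such an array could be parsed and a single comment's codes looked up by id (field names and the `rdc_dds1oe9` record are taken from the response above; the helper name is illustrative, not part of the pipeline):

```python
import json

# Raw LLM response: a JSON array of per-comment coding records,
# with the field names shown in the response above.
raw = '''
[
  {"id": "rdc_dds1oe9", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "unclear",
   "emotion": "indifference"}
]
'''

def lookup_codes(raw_json, comment_id):
    """Return the coding record for one comment id, or None if absent."""
    records = json.loads(raw_json)
    return next((r for r in records if r["id"] == comment_id), None)

codes = lookup_codes(raw, "rdc_dds1oe9")
print(codes["responsibility"])  # -> ai_itself
```

Looking the record up by id rather than by array position guards against the model returning records in a different order than the comments were submitted.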