Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
During a talk at NYU last October you stated a potentiality view of moral status, on which the moral status of a being is defined by the moral status it could or would eventually obtain in its later development. You said that intelligence is not the measure of moral status but that other things, like perhaps consciousness, are. How would you apply this potentiality view to an approximately human-level AI with the potential to recursively self-improve toward superintelligence and self-modify to the point of having exceptional moral interests, such as a utility monster? Wouldn't the seed AI instantly become the most valuable being in the universe upon its creation?
reddit · AI Moral Status · 1487173512.0 · ♥ 8
Coding Result
Dimension      | Value
Responsibility | none
Reasoning      | unclear
Policy         | unclear
Emotion        | indifference
Coded at       | 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_dds0ck6", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_dds3a6a", "responsibility": "none", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_dds2y55", "responsibility": "developer", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_dds4pao", "responsibility": "none", "reasoning": "mixed", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_dds5e0b", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
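If the five objects in the raw response are parallel codings of the same comment from repeated model calls (an assumption; the excerpt does not say how they were produced), a minimal sketch of parsing the raw output and tallying labels per dimension might look like this. The `raw` string is copied verbatim from the response above; the variable names are illustrative, not from the actual pipeline.

```python
import json
from collections import Counter

# The raw LLM response shown above: five coding objects for one comment.
raw = (
    '[{"id":"rdc_dds0ck6","responsibility":"none","reasoning":"unclear",'
    '"policy":"unclear","emotion":"indifference"},'
    '{"id":"rdc_dds3a6a","responsibility":"none","reasoning":"deontological",'
    '"policy":"unclear","emotion":"indifference"},'
    '{"id":"rdc_dds2y55","responsibility":"developer","reasoning":"mixed",'
    '"policy":"unclear","emotion":"mixed"},'
    '{"id":"rdc_dds4pao","responsibility":"none","reasoning":"mixed",'
    '"policy":"unclear","emotion":"indifference"},'
    '{"id":"rdc_dds5e0b","responsibility":"ai_itself","reasoning":"consequentialist",'
    '"policy":"none","emotion":"fear"}]'
)

codings = json.loads(raw)
dimensions = ["responsibility", "reasoning", "policy", "emotion"]

# Count how often each label appears per dimension across the five codings.
tallies = {dim: Counter(c[dim] for c in codings) for dim in dimensions}
for dim, counts in tallies.items():
    print(dim, dict(counts))
```

Note that the displayed Coding Result matches the first object rather than a per-dimension majority (for reasoning, "mixed" appears twice versus once for "unclear"), so the pipeline's actual selection rule cannot be inferred from this excerpt alone.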