Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Wouldn’t you have to program an AI to try to survive? Like I get you’re making a joke, but AI doesn’t have any desires beyond what we give it. I don’t see how AI would ever have survival instinct. I mean regardless of what you think about sentience and consciousness, there’s no evolutionary force driving a desire for survival. No pain, no fear, nothing.
reddit · AI Governance · 1734336117.0 · ♥ 3
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       unclear
Policy          none
Emotion         mixed
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_m2agxub", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_m2atrdj", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_m2b59lc", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "rdc_mfgm5tm", "responsibility": "developer", "reasoning": "mixed", "policy": "none", "emotion": "fear"},
  {"id": "rdc_m2b2z2d", "responsibility": "company", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"}
]
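The raw response is a JSON array of per-comment coding records. A minimal sketch of how such a payload can be parsed and looked up by comment id (the ids and field names are taken from the example above; this is not the tool's actual code):

```python
import json

# Two records copied from the example payload above; in practice the
# full raw LLM response string would be loaded instead.
raw = (
    '[{"id":"rdc_m2agxub","responsibility":"ai_itself",'
    '"reasoning":"mixed","policy":"none","emotion":"indifference"},'
    '{"id":"rdc_m2atrdj","responsibility":"developer",'
    '"reasoning":"consequentialist","policy":"none","emotion":"mixed"}]'
)

# Parse the array and index the records by their comment id.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up one dimension for a specific comment.
print(by_id["rdc_m2atrdj"]["responsibility"])  # → developer
```

Indexing by id makes it straightforward to compare a record against the coded result displayed above it, e.g. to spot dimensions where the stored coding and the raw response disagree.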