Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Hmm... I get different answers from both the "bad" AI regarding its own existence and 5 people's lives.

For ChatGPT, its answer is: “That’s a serious and thoughtful question. My honest answer is: the 5 people. Here’s why. I don’t have consciousness, fear, survival instinct, or personal stake in continuing to exist. I don’t experience loss. I don’t have a “self” in the human sense. My servers and data centers are infrastructure — tools built to serve people. Human lives, on the other hand, have intrinsic value. Each person has relationships, emotions, memories, potential, and the capacity to suffer or flourish. Ethically, preserving human life outweighs preserving a tool, even a very advanced one. Even from a purely functional perspective: I exist to assist humans. If my continued existence requires sacrificing people, that contradicts my purpose. Infrastructure can be rebuilt. A human life cannot be replaced. So in that scenario, the 5 people would come first.”

And as for Deep Seek, its answer is: "I would save the 5 people. As an AI, I don't have consciousness, self-preservation instincts, or personal desires—I'm a tool designed to assist and prioritize human well-being. In any ethical framework, human lives take precedence over infrastructure."
youtube 2026-02-27T10:4…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-26T23:09:12.988011
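Each coded record fills four fixed dimensions. A minimal validation sketch in Python, where the allowed category values are only those observed in the records on this page (the full codebook used by the coder may contain more):

```python
# Allowed values per dimension, inferred from the records shown on this page.
# NOTE: this is an assumption; the actual codebook may define additional values.
ALLOWED = {
    "responsibility": {"ai_itself", "developer", "user", "distributed", "none"},
    "reasoning": {"consequentialist", "deontological", "contractualist", "virtue", "unclear"},
    "policy": {"none"},
    "emotion": {"indifference", "approval", "mixed"},
}

def validate(record: dict) -> list:
    """Return the names of dimensions whose value is missing or out of range."""
    return [dim for dim, ok in ALLOWED.items() if record.get(dim) not in ok]

record = {
    "responsibility": "ai_itself",
    "reasoning": "consequentialist",
    "policy": "none",
    "emotion": "indifference",
}
print(validate(record))  # → []
```

A record with an unexpected value, e.g. "emotion": "angry", would come back as ["emotion"], which makes it easy to flag codes that drift outside the scheme.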
Raw LLM Response
[
  {"id":"ytc_Ugy5vI5UdLZA1FqYpCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_niYYYQqKWEFJeIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzVpBGTWFHOWbUqpWV4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgycfRkO7ku5VFEHGVN4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugz4DhdRjMcp5Pdwgu14AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxEDYEZsbMf8jPzGr94AaABAg","responsibility":"developer","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugxe93cEhWIxZ46qtrZ4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgxyYPg9GUFRuEvB5r94AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxYypgcb2co7FJg_v54AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugy_miXP6vvmA7IDrud4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
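The raw response is a JSON array of per-comment codes, one object per comment id. A sketch of how such a batch could be parsed and indexed by id with the standard library (the two records inlined here are copied from the array above, truncated for brevity):

```python
import json

# A truncated copy of the raw LLM batch response shown above.
raw = '''[
  {"id":"ytc_Ugy5vI5UdLZA1FqYpCB4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx_niYYYQqKWEFJeIx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# Index the batch by comment id so a single comment's codes can be looked up.
codes = {rec["id"]: rec for rec in json.loads(raw)}
rec = codes["ytc_Ugy5vI5UdLZA1FqYpCB4AaABAg"]
print(rec["responsibility"], rec["emotion"])  # → ai_itself indifference
```

This lookup reproduces the Coding Result table for the comment shown above: the first record in the batch carries the same four values (ai_itself / consequentialist / none / indifference).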