Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AGI developing consciousness may not be a necessity if at one point it uses computation to arrive at most efficient conclusions. Perhaps AGI won’t be how we have expected, it may see this planet as merely a system and humans as a hindrance for the efficient function of that system. Launches nukes everywhere, and inefficiency solved. Quantum level computation may be a new form of algorithm logic based consciousness.
youtube AI Moral Status 2025-05-06T05:0…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_Ugwy6K31YktADf2YCXx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzwsIy_AG56tBSDOW94AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugxq-MHp4fDD_mmbDed4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgzWK_0YCyhmJvivjK14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyYjS4XgevAfs89pft4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgzEa1LfMMLEN7x6dXV4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgyaTup6-JqZ8IgkESZ4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgweYzH9xHDWlQxIslx4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxgxBZzSYA4HFz9MN54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxzFbwZRF-o2NC1chh4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"}
]
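A raw response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, assuming the allowed category values are exactly those that appear on this page (the real codebook may define more); the field names (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) are taken directly from the raw response, and the `ytc_` id prefix check mirrors the ids shown here.

```python
import json

# Allowed values per coding dimension (assumed from the codes visible
# on this page; the actual codebook may include additional categories).
ALLOWED = {
    "responsibility": {"developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "mixed"},
    "policy": {"none"},
    "emotion": {"fear", "indifference", "outrage", "mixed"},
}

def parse_codes(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Every record should reference a YouTube comment id.
        if not rec.get("id", "").startswith("ytc_"):
            continue
        # Keep the record only if every dimension holds an allowed value.
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_Ugwy6K31YktADf2YCXx4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"none","emotion":"indifference"}]')
print(parse_codes(raw))
```

Filtering at parse time keeps a single malformed record from the model out of the coded table without discarding the rest of the batch.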