Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm not saying that it's okay to torture a conscious robot, but if you are going to create a conscious and conscious robot, then what do you do? The robot was originally intended to replace human power, but it was also an important purpose, as a substitute for responsibility, so as not to conflict with the interests of others when you needed something that could be make others lose, no one has the right to life and fate of any human, except God, God created human, so how in the case of demanding a human need without conflict the other human and independent means, Here, the robot is human so we can control it so long as we do not use God's consciousness (or any other mysterious force to create humans if you are atheist). God ) . Why do I want to separate and live independent of society, family, love and marriage etc as soon as possible? Because to be free you really do not need any responsibility other than personal responsibility, so how that you can be almost absolute freedom without causing any conflict or damage to yourself and the other human. But if so, do you have the means to live and survive without need society,family,love and marriage ,etc , so try to make the most intelligent artificial to best intelligence and best technology possible, but just that machine is not conscious of God and do not know trick to dominate you.
Source: YouTube, "AI Moral Status", 2017-11-10T01:5…
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   developer
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgylyI8O8zxtQr7mRAp4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzhdE-gIwaum8V8PeZ4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw9HbGXeaRogkn-RIZ4AaABAg","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_UgybbrG6ZgpiH_xeDXd4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxZKbcH63-V2jLzg3Z4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugz5JMaKRAi4OKs93xF4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxtYv6DeqIZMW7SCq94AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgyySGncDY9uizoaxcJ4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugy6TJz_lYDYOH8XjYl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgySFriW0yhiBXwaSuV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
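To cross-check the coding result above against the raw batch response, the JSON array can be parsed and indexed by comment id. This is a minimal sketch, assuming the raw response is valid JSON; the `raw` string here is an abbreviated stand-in for the full response, and `lookup_code` is a hypothetical helper, not part of the coding tool:

```python
import json

# Abbreviated stand-in for the raw LLM batch response shown above.
raw = '''[
  {"id": "ytc_UgxZKbcH63-V2jLzg3Z4AaABAg",
   "responsibility": "developer", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"}
]'''

def lookup_code(raw_json: str, comment_id: str) -> dict:
    """Parse the batch response and return the code row for one comment id."""
    rows = json.loads(raw_json)
    by_id = {row["id"]: row for row in rows}
    return by_id[comment_id]

code = lookup_code(raw, "ytc_UgxZKbcH63-V2jLzg3Z4AaABAg")
print(code["responsibility"])  # -> developer
```

Each dimension in the table (responsibility, reasoning, policy, emotion) is then just a key lookup on the returned row.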