Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
One robot should belong to one person; there should be no second robot per person. Each person should earn their own robot at the institute where they studied, with a monthly salary that varies by education. Imagine a robot working at your workplace while money arrives on your card every month without you working; consider what follows from this idea. Evaluate my idea: it is good, not a bad idea. The sale of robots should be abolished all over the world. Entrepreneurs should not sell robots to the state. All newly invented technologies should instead be created for the benefit of people. If robots are sold, life will freeze, and this is no joke. Whatever the entrepreneur or the state produces, if there is no money because life freezes, people will not buy the goods produced by that robot because money freezes; consider what would happen. Imagine a robot, under certain conditions, matched by fingerprint to a person's passport and working at that person's workplace anywhere in the world, while the person travels between countries and lives a meaningful, interesting life; the inventors of such robots should be applauded. But if robots are sold, people will look at the inventors of robots as destroyers who freeze life. If the inventors of robots give one robot to one person, both parties will always benefit from the contract. Evaluate my idea.
youtube AI Moral Status 2026-02-10T16:5…
Coding Result
Dimension      | Value
Responsibility | distributed
Reasoning      | contractualist
Policy         | regulate
Emotion        | approval
Coded at       | 2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugygj31d9vUPJxpDlXZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgynXrHpeUn6mRBoXEF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwNn-NY4VFHYlckRGV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgypFsKh1xXr_UtgG014AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyD2Lla12dR-blctcp4AaABAg","responsibility":"company","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy91a0RcSvwjM6qFSh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugyq7Zoj3z6nGytGZo14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugw6mZpCECSPS55_4Kh4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzSkFSpNpZ5V_htXGh4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugww_PVaJliHhr-mQch4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"}
]
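The raw response is a JSON array of one coding record per comment. A minimal sketch of how such output could be parsed and sanity-checked is below; the allowed values per dimension are inferred only from the records shown on this page and may not cover the full coding scheme, and the function and variable names are illustrative, not part of any documented pipeline.

```python
import json

# Allowed values per dimension, inferred from the records above (assumption:
# the real coding scheme may define additional labels not seen here).
ALLOWED = {
    "responsibility": {"distributed", "none", "company", "ai_itself"},
    "reasoning": {"contractualist", "consequentialist", "deontological", "virtue", "unclear"},
    "policy": {"regulate", "none", "liability", "ban"},
    "emotion": {"approval", "indifference", "fear", "outrage", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and validate each record's labels."""
    records = json.loads(raw)
    for rec in records:
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: unexpected {dim} value {rec.get(dim)!r}")
    return records

# Usage with a single hypothetical record:
raw = '[{"id":"ytc_example","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}]'
print(len(parse_codings(raw)))  # → 1
```

Validating against a fixed label set catches the most common failure mode of LLM coders: off-scheme labels that silently corrupt downstream tallies.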