Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We should not give emotions and feelings to robots. Not every living species have the ability to do it. A robot not prepared for it may use it in the wrong way. It may change the way it would react without emotions. If you give it emotions, it cannot be controlled. If you make it logical, it destroys humanity. Because humanity can't control itself and destroys the earth in the process. If the robot would read the whole story of humans. It has a chance to come to a conclusion, that humans will end the planet badly with a 60~100% chance. And I am sure that the robot won't believe in the 40~0%. So it may not do what we want it to do. Let's just forget about giving robots intelligence. Please, humanity, stop. It will end badly. We are not gods, and it's not in our concern to create life for science or for joy. Just a bunch of codes that analyses or does only one task is one thing. Another bunch of codes that learn, decide itself and may change something more than just what breakfast you should have, is another thing.
YouTube · AI Moral Status · 2019-06-18T15:4…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           industry_self
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyOJfYxKPGjB1Ix7KN4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwVNtk5A7TC4hfNsjF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwoZKwVbJjpNDmfwNV4AaABAg","responsibility":"user","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_Ugwu2PuzXEwwlro1exp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxHc4Um5b97zb_H4hZ4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"fear"},
  {"id":"ytc_UgwI2KA_rxa7MaIy4eR4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzp0W5FNyMcz-M74Gp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzM0zeb2XqYwQZ661N4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxRq0G8ckN4ASLxiNB4AaABAg","responsibility":"government","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzoLwzNi9nE5fJ_-u94AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
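A batch response like the one above can be parsed and sanity-checked before the per-comment rows are filled in. The sketch below is a minimal example, not the project's actual pipeline code; the label sets are assumptions inferred from the values visible in this batch, and the full codebook may define others.

```python
import json

# Label sets observed in this batch (assumed, not the official codebook).
RESPONSIBILITY = {"none", "user", "developer", "distributed", "government"}
REASONING = {"deontological", "consequentialist", "virtue", "mixed"}
POLICY = {"none", "liability", "regulate", "industry_self", "unclear", "ban"}
EMOTION = {"indifference", "outrage", "approval", "fear", "mixed", "resignation"}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response and index records by comment id,
    raising ValueError if a record is missing a dimension or uses an
    unknown label."""
    records = {}
    for rec in json.loads(raw):
        for dim, allowed in [("responsibility", RESPONSIBILITY),
                             ("reasoning", REASONING),
                             ("policy", POLICY),
                             ("emotion", EMOTION)]:
            if rec.get(dim) not in allowed:
                raise ValueError(f"{rec.get('id')}: bad {dim}={rec.get(dim)!r}")
        records[rec["id"]] = rec
    return records

# One record from the batch above, as a single-line JSON string.
raw = ('[{"id":"ytc_UgxHc4Um5b97zb_H4hZ4AaABAg",'
       '"responsibility":"developer","reasoning":"consequentialist",'
       '"policy":"industry_self","emotion":"fear"}]')
coded = parse_batch(raw)
print(coded["ytc_UgxHc4Um5b97zb_H4hZ4AaABAg"]["emotion"])  # → fear
```

Indexing by `id` makes it straightforward to look up the row that produced the "Coding Result" table for any given comment.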