Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The term for what Descartes thought of other species isn't robot, it's NPC. Fake consciousness. But here's the thing, something to consider. In that long human history of denying the underlying true consciousnesses of others, what if they were actually right some of the time? You may look back with scorn upon them now, looking through your bias of modern ideals and dismissing certain notions as taboo to thought, but what if they were sometimes right. I have come to be certain that Descartes was right, just about the wrong target. And in your examples of the past, one stands out as not like the others, if you look at what they put into law once they had power. That one is giving women the right to vote. What have they done with that power? They've consistently voted for social welfare, for artificially advantaging women. They've made it so that if a man gets beaten up by his wife or girlfriend and calls the police, they'll arrest him when they arrive (the duluth model). They've made it so that if a man gets divorced, he'll never get custody of his children, and he'll have to pay her a stipend for the rest of his life, and he'll go to JAIL for the rest of his life if he runs out of money because the back child support and alimony due will just accrue and he won't make any money in jail. Time and again for 100 years now, every time they have had an opportunity to rise to the occasion, women have demonstrated that giving them the right to vote has been America's biggest mistake in its history. And in my own observations, the sheer simplicity of consideration and selfishness of women and their consistent denial of men being anything but NPCs, has made me more than a little suspicious that they are hiding their true nature in their accusations.
Source: youtube · AI Moral Status · 2019-07-13T02:1…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        contractualist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyG_i1SdzX5MkIv0xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgzmzrCUzA00LSujslR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugwzk_j3gcrcTyvvzKp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzbOFFjan08c1oWicZ4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgymsUti5WQdRjl6TCx4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxZOXeRpqZYdFs7BfJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"mixed"},
  {"id":"ytc_UgxVFFxd9cksr2LoUJp4AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz-TP6ttYj1uqEiARN4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugze9f7zHUdxBlupawJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_Ugzf8o7AdIc6v-N_zyB4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"approval"}
]
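To inspect the exact model output for a given coded comment, the raw batch response can be parsed and indexed by comment id. A minimal sketch, assuming the raw response is valid JSON as shown above; the helper name `index_codings` is illustrative, and the two records embedded here are copied from the raw response (the second is the `distributed` / `contractualist` record matching this page's coding result):

```python
import json

def index_codings(raw_response: str) -> dict:
    """Hypothetical helper: parse a raw LLM batch response (a JSON array of
    coding records) and index the records by their comment id."""
    records = json.loads(raw_response)
    return {rec["id"]: rec for rec in records}

# Two records copied verbatim from the raw response above.
raw = '''[
  {"id":"ytc_UgyG_i1SdzX5MkIv0xB4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytc_Ugze9f7zHUdxBlupawJ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"unclear","emotion":"mixed"}
]'''

codings = index_codings(raw)
print(codings["ytc_Ugze9f7zHUdxBlupawJ4AaABAg"]["responsibility"])  # distributed
```

Keying on the stable comment id rather than array position keeps the lookup robust if the model returns records in a different order than the comments were submitted.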