Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
another fool who thinks robots can be like humans. sure they can be programmed to act like a friend, but what makes that better than the "fake" people we already have right now? just like the fake human, it doesn't actually have feelings for you, it just shows them. and what use would school be for a robot that can become a top scientist in a single day? what use do creativity and art serve to a robot? why would a robot care to start a business? for money? the same money used by humans to ultimately get food and survive? they don't need food. they don't need anything but batteries and maintenance. but why go through the trouble of starting a business to get the money to buy batteries and maintenance, when your creator is already doing that for you? making real life synths is the most impractical and inefficient way to get work done. they'll have all the issues that humans have: hesitation, free will, malicious intent, bad choices, etc.. the only point in making them would be to make them secretly, so no one knows it's a robot, and like the Institute does, you can replace people to the point where you have an entire population that is literally completely under your control. but the common people will never know that their best friend is a robot, only the government will. but for the blue collar industry, what's better? a human that gets tired, hates you, and has the possibility of quitting, or a robot that does its job 10x better than the human, doesn't care what you say or do to it, doesn't need money, and can work 24/7? yeah, a robot sounds a lot better.
youtube AI Moral Status 2017-06-04T18:4…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        deontological
Policy           none
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgifEAwtbjabXngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UggIGghOgJWG2HgCoAEC","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgghyOIVkcBXGngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghMBbJjpWQIVHgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytc_Uggm500G26ICangCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgjqGk5qSHkns3gCoAEC","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgghPCmF7l-eWngCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugh581ni_3YUDXgCoAEC","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UghqQAuaEZ4xM3gCoAEC","responsibility":"distributed","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Uggge1rjOHpU9HgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"mixed"}
]
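A response like the one above can be parsed and validated before it is stored as a coding result. The sketch below is a minimal example, assuming the four dimensions and the code values visible in this export (the actual codebook may define more); `parse_llm_response` and the `ALLOWED` schema are hypothetical names, not part of any coding tool's API.

```python
import json

# Allowed codes per dimension, inferred from the values seen in this export
# (hypothetical schema; the real codebook may allow additional values).
ALLOWED = {
    "responsibility": {"none", "ai_itself", "company", "distributed"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"none", "regulate", "ban"},
    "emotion": {"resignation", "outrage", "indifference", "mixed", "fear"},
}


def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only records whose codes
    fall inside the allowed schema for every dimension."""
    records = json.loads(raw)
    return [
        rec for rec in records
        if all(rec.get(dim) in codes for dim, codes in ALLOWED.items())
    ]


# One in-schema record and one deliberately invalid record.
raw = (
    '[{"id":"ytc_Uggm500G26ICangCoAEC","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"mixed"},'
    '{"id":"ytc_bad","responsibility":"martians","reasoning":"unclear",'
    '"policy":"none","emotion":"mixed"}]'
)

coded = parse_llm_response(raw)
print([r["id"] for r in coded])  # only the in-schema record survives
```

Validating against an explicit schema like this catches the occasional off-codebook value an LLM emits before it can silently corrupt the coded dataset.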