Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
With an estimated price tag of $20k-$30k that seems high initially. But if you think about it, if it can do your laundry, pick up and clean the house, cook, lawn care, get the mail, feed let out and walk the dogs, secure the house, (just considering those, and not the hundreds of other applications) that’s like having a full time maid/nanny/chef/lawn guy/dog walker etc and cheaper than if you had a person do all of that. Eventually it basically pays for itself. Even if you don’t already pay for all that human help, imagine the amount of time you’d get back. All the energy you’d get back. The stress reduction. I think Elon is right. This is definitely a game changer. It will eventually become stupid not to get one (apart from the potential iRobot AI/Terminator takeover scenario that Hollywood pushes that so many are worried about). After seeing this thing walk, slowly shuffling its feet like Joe Biden, it’s not as intimidating. None the less, there will always be concern about its potential danger to human life, so if this is going to be a true game changer, then robot makers need to have a feature to help the masses alleviate that concern. Like a physical killswitch that cant be disabled in the code by a hostile AI, and also can’t be physically tampered with, without it immediately shutting down the robot (in case it tries to physically disable the killswitch). Not that hard for these engineers to do. Granted, even with a physical, tamper-proof killswitch, there would still be concerns that it would kill you in your sleep, or poison your food. So for faster widespread adoption and also long term human safety, there’s still some things that need to be addressed (and marketed) to overcome some of the fear.
youtube AI Moral Status 2025-01-08T15:3… ♥ 4
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         approval
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_Ugyow0nZSpENfTPyaKp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyyrpWO4tqrBrFayW54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwLCF6dc_CJgpi_YsB4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgzA9XUs9X3upY8zGtN4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxyLSE0yngkcNN5_rJ4AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwea-RAdIIJQrcYjYF4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugx6nyEMAzDdTHjUL914AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugwe-WV2Z5UxxvBGj094AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxWwzUDm0yW8stzoJ94AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugzy69UeT3K8Reasrjx4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
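The raw response is a batched JSON array, so each record has to be matched back to its comment by `id` before the per-comment coding table above can be filled in. The sketch below (not the tool's actual code) shows one way to do that: the record shape (`id`, `responsibility`, `reasoning`, `policy`, `emotion`) is taken from the raw response shown, while the allowed value sets are merely inferred from the values that appear in it and may be incomplete.

```python
import json

# Value vocabularies inferred from the raw response above; the real
# codebook may define more labels than appear in this one batch.
ALLOWED = {
    "responsibility": {"none", "government", "developer", "ai_itself"},
    "reasoning": {"unclear", "consequentialist", "deontological"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"indifference", "approval", "fear", "outrage", "mixed"},
}

def index_codings(raw: str) -> dict:
    """Parse a raw batched LLM response and index records by comment id,
    dropping any record with an out-of-vocabulary value."""
    out = {}
    for rec in json.loads(raw):
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            out[rec["id"]] = rec
    return out

# Looking up the comment shown above recovers the coding-result row
# (reasoning: consequentialist, emotion: approval).
raw = ('[{"id":"ytc_UgyyrpWO4tqrBrFayW54AaABAg","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
coding = index_codings(raw)["ytc_UgyyrpWO4tqrBrFayW54AaABAg"]
print(coding["reasoning"], coding["emotion"])  # consequentialist approval
```

Validating against the vocabularies before indexing means a malformed or hallucinated record is silently skipped rather than written into the coding table.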