Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Everybody is tripping about this prediction dates, specially about the physical world and robots. I follow these robots evolving for more than 20 years, and almost nothing changed (but the software side). They still have horrible battery, they have moving parts that break and need repair (different from phones and PCs), they can only do very specific things, they would need to be very heavy to have enough weight to do heavy jobs like building a house. The list goes on and on and on. For over 20 years the only thing I see is the same robots doing different dances and narrow jobs in completely safe and predictable environments. Wake up people, even smart homes like a smart lamp or curtain are not good enough yet. They break up, lose connection, do funky things. Most of the market ditch smart home stuff always after 3 or 4 failures and stress situations. There is no way in 2030 we will have humanoid robots building houses. They will be very very expensive and limited, or they will be a joke. The only way a prediction like this is correct is if super intelligence arrives and creates new meta materials and physics, batteries and stuff that we can’t even imagine to build the robots we can’t today. Chinese dancing robots and tesla robots are a joke, no normie would keep them after they fail to do the laundry the third time. Everybody is buying crazy magnificent 7 claims and believing them, while forgetting that they sell these dreams to inflate the stock market. Humanoid robots are FAR from doing your laundry, plumming, or building real world houses and structures. (The best we will get in the near future are better printed homes) Wake up. (And I’m a huge believer in tech, AI)
youtube AI Governance 2025-09-05T22:0…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       mixed
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgykP3n9tyxj7c8HK8N4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwZv5iUnA_faPp4l5t4AaABAg","responsibility":"user","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgzPNFQ-UalQT0O0fHB4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugw3h9BXK9xpTAVorTl4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugw2O7CFCRebr2jJM-l4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFoYOlGdNUEDdwCWN4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgxKyN7WbSgSkZ6mW2F4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxCqhv7qJXGdiFvhb54AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgxRfAdrQHhXfwiNe7p4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxfblMZmy_wW_icUlV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"mixed"}
]
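A raw response like the one above can be checked mechanically before its codes are trusted. Below is a minimal Python sketch of that step: it parses the JSON array and validates each entry against a codebook inferred only from the values visible in this record (the real codebook may define more categories, so the `CODEBOOK` dict here is an assumption). The entry `ytc_UgzFoYOlGdNUEDdwCWN4AaABAg`, whose codes match the Coding Result shown above, is used as the lookup example.

```python
import json

# A one-entry excerpt of the raw model output above; in practice `raw`
# would be the full JSON array returned by the LLM.
raw = '''[
  {"id": "ytc_UgzFoYOlGdNUEDdwCWN4AaABAg",
   "responsibility": "none", "reasoning": "mixed",
   "policy": "none", "emotion": "resignation"}
]'''

# Allowed values per dimension, inferred from the codes that appear in
# this record -- an assumption, not the project's official codebook.
CODEBOOK = {
    "responsibility": {"none", "developer", "user", "company", "distributed"},
    "reasoning": {"none", "mixed", "deontological", "consequentialist"},
    "policy": {"none", "ban", "regulate"},
    "emotion": {"none", "mixed", "resignation", "indifference",
                "fear", "approval", "outrage"},
}

def parse_codes(text: str) -> dict:
    """Parse the raw response and validate every code against CODEBOOK.

    Returns a dict keyed by comment id, so one coded comment can be
    looked up directly, as the page above does.
    """
    rows = json.loads(text)
    for row in rows:
        for dim, allowed in CODEBOOK.items():
            if row.get(dim) not in allowed:
                raise ValueError(
                    f"{row.get('id')}: unexpected {dim}={row.get(dim)!r}")
    return {row["id"]: row for row in rows}

codes = parse_codes(raw)
print(codes["ytc_UgzFoYOlGdNUEDdwCWN4AaABAg"]["emotion"])  # resignation
```

Validating before indexing matters here because LLM output is not guaranteed to be well-formed: `json.loads` catches malformed JSON, and the codebook check catches hallucinated category labels that would otherwise flow silently into the coded dataset.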