Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I can see why he agreed to be on this guy's show. None of his answers made any sense. He's trying to make life after AI sound great, and it won't be. It's a check whether it's UBI or tokens. Neither will fill the void he and the others developing AI are creating. Everyone will have too much time to sit around and think of ways to hate others. Who's going to settle for a certain amount that can't be increased with OT so you can have nicer thing ? What reason does Sam have for caring about any of us at all? Robots will he able to mine and plant food, and the planet's resources will last forever, with 8 billion fewer people using them. He's keeping people calm so he can get AI to the point he doesn't have to lie anymore.
youtube AI Moral Status 2026-03-16T18:2… ♥ 1
Coding Result
Dimension       Value
Responsibility  developer
Reasoning       consequentialist
Policy          none
Emotion         resignation
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytc_UgxDH4I00pEQiTqNVwl4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgysgaxRySe2664aTqt4AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugx0mDHNZpWtCLRtK3J4AaABAg", "responsibility": "company",     "reasoning": "deontological",    "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_Ugwc9AETtnp2NcGjvOB4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy7JY5EJu6WYxEEOBR4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "outrage"},
  {"id": "ytc_UgyCp8wX_If0rdf1fHh4AaABAg", "responsibility": "government",  "reasoning": "deontological",    "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxPTjaV6HbctVYoXWt4AaABAg", "responsibility": "distributed", "reasoning": "contractualist",   "policy": "liability", "emotion": "outrage"},
  {"id": "ytc_UgxM3oVjRU4ofFEXNAB4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugzv9m5IH9n1ls4EfPV4AaABAg", "responsibility": "developer",   "reasoning": "consequentialist", "policy": "none",      "emotion": "resignation"},
  {"id": "ytc_UgxZEJIulBXsEi35Owt4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"}
]