Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
asking the wrong questions, the main question, what happens to humans when AI and robots take over and work for us, but we make no money to afford these things. The new butler robot costs $20,000, the telsa car costs $100,000, only 10% of the entire US can afford that, and still live comfortably. What happens to the rest of us and the future if we can't afford it. More important, when the mobile ID kicks in, and you MUST have it to buy food, what will happen to most of us. It's like someone sat down, watched all the futuristic movies ever made, and decided, I want to make those things come true, without pondering the consequences. Be careful, we either become Galaxy Express 999 or Cloud Atlas. Sometimes the best choice, is to choose not to play at all. How about a nice game of chess.....
youtube Cross-Cultural 2025-11-14T22:0…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgzCgiLrGshOrXiuFOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz0T-N6DIC_CetKEah4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugy5oYZ0RhCL5n2DAnJ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_Ugyd7-G37GD5gFwJUZV4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzFqxqkCGcix7WP5gF4AaABAg","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxQtPUw0-9v5ua176l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyHUmeOwYPdRJV9Fu54AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyKdqM8kuPPhV5WXTp4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"},
  {"id":"ytc_UgzxFONxRfWv5ibSs0N4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgxKk7_zR0Yx_2JFHWx4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
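The coding-result table above is produced by matching this raw JSON batch back to the comment's id. A minimal sketch of how such a response might be parsed and looked up (the `lookup` function, its validation logic, and the key set are illustrative assumptions, not part of the tool itself; the raw string is abridged to two records from the batch above):

```python
import json

# Abridged batch response, two records copied from the raw LLM output above.
raw = '''[
  {"id":"ytc_Ugz0T-N6DIC_CetKEah4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgzCgiLrGshOrXiuFOl4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"}
]'''

# The four coding dimensions plus the comment id, inferred from the records above.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def lookup(raw_json: str, comment_id: str) -> dict:
    """Return the coding record for one comment id, checking its shape."""
    records = json.loads(raw_json)
    by_id = {r["id"]: r for r in records}
    record = by_id[comment_id]          # KeyError if the model skipped this comment
    missing = REQUIRED_KEYS - record.keys()
    if missing:
        raise ValueError(f"record {comment_id} is missing keys: {missing}")
    return record

rec = lookup(raw, "ytc_Ugz0T-N6DIC_CetKEah4AaABAg")
print(rec["emotion"])  # fear
```

Matching by id rather than by position guards against the model reordering or dropping comments in its batch output.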