Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Great episode — but here’s a thought: we keep talking about how AI will accelerate competition, replace jobs, create wealth concentration and so on. What if instead of doubling down on profit-driven models we flipped the script entirely and started designing a money-free society — where collaboration, shared resources and universal access are the default rather than competition for profit? With AI, automation and abundance coming faster than many expect, why not ask: if the basic costs of production drop dramatically (thanks to AI + robotics + networked systems), then could the next frontier be co-operative abundance rather than more market-based scarcity? Questions for the panel: 1️⃣ If we won’t need most jobs because of AI, what’s the point of tying meaning to competition + income? 2️⃣ Could a system where you’re free from money obligations unlock far more human creativity, purpose and wellbeing — rather than another race to climb the ladder? 3️⃣ What structural changes (social, tech, political) would need to be in place for a money-free model to not just survive but flourish in an AI-dominated future? I’d love to hear your take — especially from those of you who see AI as inevitable. If the traditional economy is going to collapse under its own weight, why not leap ahead instead of patching the old system? Thanks for hosting this conversation.
youtube 2025-10-23T02:0… ♥ 1
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       contractualist
Policy          regulate
Emotion         approval
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UgyyduUax8aXAaxWBCZ4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxkegbUjbDGXRs1R7l4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxZwugD7yGkSe4x25B4AaABAg","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgySTUwVMxYZqUVI6s14AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgyarYjr72BHEDgJpNx4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_UgxKlsyJZTcEhJRysnx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgwkiZi6ACF1Ypjv2_B4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugx3h8OillGziL183h14AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgySBj6_hE2RJPw6AZ54AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugxw4lP_wFRhxbG6B-V4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
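The raw response above is a JSON array with one record per coded comment. A minimal sketch of how such a response could be parsed and sanity-checked before use — note that the allowed value sets below are inferred only from the values visible in this response, not from the full codebook, and the function name `parse_llm_response` is a hypothetical helper:

```python
import json

# Allowed values per coding dimension, inferred from the response shown
# above (assumption: the actual codebook may define additional values).
ALLOWED = {
    "responsibility": {"developer", "company", "user", "distributed", "ai_itself", "none"},
    "reasoning": {"deontological", "consequentialist", "contractualist", "virtue"},
    "policy": {"none", "liability", "regulate", "ban"},
    "emotion": {"outrage", "indifference", "fear", "approval"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response, keeping only well-formed records."""
    records = json.loads(raw)
    valid = []
    for rec in records:
        # Each record must carry the comment id plus all four dimensions,
        # and every dimension value must come from its allowed set.
        if not isinstance(rec, dict) or "id" not in rec:
            continue
        if all(rec.get(dim) in vals for dim, vals in ALLOWED.items()):
            valid.append(rec)
    return valid

raw = ('[{"id":"ytc_UgyarYjr72BHEDgJpNx4AaABAg",'
       '"responsibility":"distributed","reasoning":"contractualist",'
       '"policy":"regulate","emotion":"approval"}]')
coded = parse_llm_response(raw)
print(coded[0]["policy"])  # regulate
```

Validating against the allowed sets catches a common failure mode where the model invents an off-schema label; such records are dropped rather than stored.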