Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Dear Peter, I’ve been following the "Abundance" thesis for years, but your recent OpenClaw debate highlighted a fundamental economic contradiction that remains unanswered. On one hand, you predict the inevitable rise of companies with $100 trillion market caps within the next decade. On the other, you argue that AI will usher in an era of absolute abundance, where energy, intelligence, and labor are demonetized to near-zero cost.

Mathematically, these two realities cannot coexist. A market capitalization of that magnitude represents the present value of massive future cash flows. For a company to be worth $100 trillion—roughly equal to today's entire global GDP—it must capture an astronomical amount of value from the rest of the economy. It requires wide moats and high profit margins. Conversely, your abundance model relies on the marginal cost of replication dropping to zero. If AI truly makes everything free, competition should force prices down, collapsing profit margins and valuations, not inflating them.

You cannot promise a deflationary utopia where goods are free for humanity, while simultaneously promising a hyper-capitalist future where a single entity captures more wealth than currently exists. Is the future free, or is it owned by a $100T monopoly? How do you reconcile this math?
youtube 2026-02-06T16:4… ♥ 63
Coding Result
Dimension        Value
--------------   --------------------------
Responsibility   company
Reasoning        consequentialist
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[ {"id":"ytc_Ugx0USz-L7apknavdVR4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_Ugyn1FGb3LatqP5lyS14AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgzBHVkbNaX2eraq7s94AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgwXeDERcxgw1U6bD-Z4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"mixed"}, {"id":"ytc_UgygpzwB563bpAYBe-l4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"}, {"id":"ytc_UgzZkDKIq-tp0vmCQQR4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"}, {"id":"ytc_UgyPCkiteQ4UO9vEjqZ4AaABAg","responsibility":"distributed","reasoning":"contractualist","policy":"regulate","emotion":"approval"}, {"id":"ytc_UgyGUTN6c8zD9viaOXl4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"outrage"}, {"id":"ytc_UgzMI4x4N5hHqhVifmR4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"industry_self","emotion":"approval"}, {"id":"ytc_Ugw579cE_BfZBgH8AO54AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"} ]