Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We should release Social Security now. Not as a giveaway—but as a bridge. Social Security was built for a different century: a world where you worked a lifetime, retired at 65, and died before 80. It was a stabilizer for predictable lives. That world doesn’t exist anymore. A.I., automation, and robotics are restructuring everything faster than the system can blink. Millions will lose jobs, homes, and the illusion that labor equals worth. We’re standing in the doorway of abundance—and no one’s guiding the transition. So why wait until people are broken to give them what they already paid for? Use Social Security now—as a buffer while we build the next model of society. This isn’t “raiding the fund.” It’s honoring the promise. Workers funded the foundation of this nation, and now the house is shifting. Let them draw from the insurance they built. Protect the elderly, yes—but extend support to everyone being displaced by the next industrial revolution. Economists already warn the trust fund will run dry in about a decade. So either we let it vanish into accounting dust—or we use it now, while money still means something, to soften the landing and buy time to rethink how we distribute value in a post-work world. A.I. could give us universal abundance. But if we cling to 20th-century economics while 21st-century machines rewrite reality, we’ll turn abundance into collapse. Let’s cash the promise early—while there’s still time to build a new one.
Source: youtube · AI Jobs · 2025-10-08T14:5…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_Ugy1HvraNa4_g5Ye8AF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugz4cEVi0twA3noDMct4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"approval"},
  {"id":"ytc_UgxyzUYcEOkZHQglYId4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_Ugy_a3V39bqw_Kw3-xd4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"approval"},
  {"id":"ytc_UgwGyq2VIYYxUhNxNCR4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugwq4wu1hqnbMr6rBbd4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwcKcw0MS-rSRH93yR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugwei9aiE0F1LnOTQBV4AaABAg","responsibility":"government","reasoning":"deontological","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgxSBOt0guDjIL2kUSB4AaABAg","responsibility":"ai_itself","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwxZPQnbsDs2zNBpn14AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"liability","emotion":"approval"}
]
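To cross-check the coding-result table against the raw response, the JSON array can be parsed and the record looked up by comment id. This is a minimal sketch, not part of the pipeline itself; the `lookup` helper and variable names are illustrative, and the response is abbreviated to two records here.

```python
import json

# Raw LLM response: a JSON array of coded comments (abbreviated from the full output).
raw_response = '''[
 {"id":"ytc_Ugy1HvraNa4_g5Ye8AF4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
 {"id":"ytc_Ugz4cEVi0twA3noDMct4AaABAg","responsibility":"none","reasoning":"deontological","policy":"ban","emotion":"approval"}
]'''

def lookup(records, comment_id):
    """Return the coding record for a given comment id, or None if absent."""
    return next((r for r in records if r["id"] == comment_id), None)

records = json.loads(raw_response)
coded = lookup(records, "ytc_Ugy1HvraNa4_g5Ye8AF4AaABAg")
print(coded["emotion"])  # emotion coded for this comment → "fear"
```

The coded dimensions (responsibility, reasoning, policy, emotion) match the table above for this comment id.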