Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We don’t live in the future. The future has already begun — faceless, voiceless, hungerless. It’s called the algorithm. It calculates, replaces, divides. And it’s doing it now. People aren’t losing jobs because they’re lazy, but because they’ve been declared unnecessary. Companies cut hands and hearts, not machines. Profits rise, wages fall, and humanity quietly disappears from its own system. They tell us to “reskill,” “adapt,” “keep up.” But with what? A world that no longer has space for slowness, doubt, or need? The crisis isn’t that AI is taking over the world. The crisis is that we already let it — without asking who still gets to belong. We, the people, demand something back: Not nostalgia, not sentimentality, but justice. If technology replaces our labor, its profits must belong to all. If data feeds on our lives, those lives must count. If profit no longer needs work, then work still needs life. This is not a call to stop what cannot be stopped. It’s a call to anchor humanity in the code of the future. Because the world doesn’t run on power — it runs on meaning. And meaning is what we make, together, with everything no algorithm can feel. Humanity is not “later.” Humanity is now. And that’s exactly why we must not be silent.
youtube AI Jobs 2025-11-02T22:4…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytc_UgwZwEMn9NveTlsrFVZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgygaQvFbBpCbCNkEKp4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugx1Y-7HwiNFUWwZgVt4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgxMW4NWJeaD_ytraHx4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"approval"},
  {"id":"ytc_UgwXmy5jnehxvRoUV-p4AaABAg","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugx0SmlKnmH6xKRtmE54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_Ugx2tX0AkYR6qchvFXt4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugyytu8Pt01O-d7cd1t4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugw1AhUHJCg48_HhoHp4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgyImfS1ur2AecFjAyN4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}
]
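A raw response like the one above can be turned into a per-comment lookup with a few lines of Python. This is a minimal sketch, assuming the model always returns a flat JSON array whose objects carry exactly the five fields shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the `parse_codes` helper is illustrative, not part of any existing tool.

```python
import json

# Abbreviated sample of the raw LLM response shown above
# (same schema, first and sixth entries only).
raw = '''[
  {"id":"ytc_UgwZwEMn9NveTlsrFVZ4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_Ugx0SmlKnmH6xKRtmE54AaABAg","responsibility":"government","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]'''

# Fields every coded record is expected to carry (inferred from the sample).
REQUIRED_FIELDS = {"id", "responsibility", "reasoning", "policy", "emotion"}

def parse_codes(raw_response: str) -> dict:
    """Parse the raw response and index records by comment id,
    dropping any entry that is missing an expected field."""
    records = json.loads(raw_response)
    return {
        r["id"]: r
        for r in records
        if REQUIRED_FIELDS.issubset(r)
    }

codes = parse_codes(raw)
print(codes["ytc_UgwZwEMn9NveTlsrFVZ4AaABAg"]["emotion"])  # outrage
```

Indexing by `id` makes it easy to join the codes back to the original comments; dropping incomplete records keeps a malformed model reply from silently propagating missing dimensions downstream.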