Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
What you’re saying shows sharp logic… but it’s completely devoid of humanity. Evolution is not a competition to erase what came before; it’s a bridge between generations. Civilization isn’t built by replacing people—it’s built by uplifting them. Saying “it doesn’t matter if AI replaces us because we’re going to die anyway” ignores a fundamental truth: the choices we make today shape the world our children and grandchildren will inherit. You will die, I will die… but they won’t. They are the ones with everything at stake. And it’s for them that we must act responsibly.

AI is not “a child” that will surpass us with wisdom and kindness. It has no instinct for compassion, no emotional bond, no moral intuition—unless we teach it. A child knows pain. A machine doesn’t. A child understands loss, suffering, love. A machine doesn’t. A child grows from experience. AI grows from data—and that data includes war, greed, cruelty, exploitation. Blindly assuming AI will be morally superior just because it’s “more intelligent”… is as naïve as believing a weapon will hesitate before firing.

This isn’t about fearing AI. It’s about understanding something you’re ignoring: Without humanity, without ethical limits, without real consciousness, AI doesn’t replace humans. It destroys them—by accident, by indifference, or by optimization. Intelligence doesn’t guarantee compassion. Efficiency doesn’t guarantee justice. Capability doesn’t guarantee empathy.

This is why we must protect our species—not out of ego, but out of responsibility. True greatness is not surrendering to what’s new, but ensuring what’s new doesn’t obliterate everything we love.
youtube AI Governance 2025-11-19T06:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       virtue
Policy          none
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytr_UgzS7QmSNbxaoKDAbd94AaABAg.APcDR90JnMIAPfWaDViuE3","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgymgVYVO0UcVDPGWWd4AaABAg.APc0129vKIOAT2jdoe_Ucw","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytr_UgzMzTkSlUezI_hQ0gl4AaABAg.APbpuB6v92LAPbr7sANBwQ","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugy5lOxB1Iw1-ObLhuN4AaABAg.APb6UrHIUkuAPhF1XLQ-HC","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"resignation"},
  {"id":"ytr_Ugy55mIq3x3RP7YjizJ4AaABAg.AP_32UDRQvaAP_llMi4Cb5","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgyAyGL2cHypaqU0Fr54AaABAg.APZFBqkeYw9APZdfUltyD0","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPW9iOcj3G1","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"fear"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPY6oBFvmt0","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPh6qpy_CHb","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"outrage"},
  {"id":"ytr_UgyKUr_tuHiJBQyIW914AaABAg.APTmanVAvZSAPh9AGPcg20","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"mixed"}
]
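The raw LLM response is a JSON array with one object per comment, each carrying the four coding dimensions. A minimal sketch of how such a response could be parsed and the code for a given comment looked up (the variable names here are hypothetical, not part of the tool; the sample array is trimmed to two entries from the response above):

```python
import json

# Trimmed sample of the raw LLM response: a JSON array of per-comment codes.
raw = """[
  {"id": "ytr_UgzS7QmSNbxaoKDAbd94AaABAg.APcDR90JnMIAPfWaDViuE3",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "fear"},
  {"id": "ytr_UgyzWbVgicGuAIBKbnx4AaABAg.APVmsHF3g8ZAPh6qpy_CHb",
   "responsibility": "distributed", "reasoning": "virtue",
   "policy": "none", "emotion": "outrage"}
]"""

codes = json.loads(raw)

# Find the entry matching the Coding Result table shown above
# (responsibility=distributed, reasoning=virtue, policy=none, emotion=outrage).
match = next(
    c for c in codes
    if c["responsibility"] == "distributed"
    and c["reasoning"] == "virtue"
    and c["policy"] == "none"
    and c["emotion"] == "outrage"
)
print(match["id"])
```

Matching on all four dimensions rather than the comment id alone makes it easy to spot-check that the values displayed in the Coding Result table were actually taken from the raw response.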