Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
🔁 History repeats itself — especially when new technology arrives. Every time a groundbreaking technology enters the scene, we hear the same chorus: utopia or apocalypse. It’s a cultural echo — fear and fascination dancing around “the new.”

Here’s the pattern, again and again:

1. Panic vs. Salvation
New tech always triggers extremes. Some see salvation (“AI will solve everything”), others see doom (“AI will take over the world”). Two sides of the same coin — uncertainty wrapped in strong narratives.

2. Historical déjà vu
– The printing press was called the devil’s tool — it spread “dangerous ideas.”
– Railways were blamed for ruining health and disturbing nature.
– Electricity was feared for corrupting morals and disrupting sleep.
– TV and video were accused of dumbing us down and isolating us.
– The internet was hailed as democratic — and condemned as destructive.
Now it’s AI’s turn to wear the halo or horns.

3. Exaggeration has a purpose
These reactions aren’t just noise — they help us think aloud about ethics, consequences, and boundaries. Even dystopias are mental fire drills: What could go wrong, and how do we prevent it?

4. Tech doesn’t decide — people do
The idea that technology drives history on its own is flawed. AI won’t “take over” unless we let it. It’s human choices — political, economic, cultural — that shape its role.

5. Reality is always more complex
History shows us that tech rarely delivers on its wildest promises — good or bad. It doesn’t save or destroy us. But it does change us, deeply and gradually.

Bottom line: When people say “AI will take over the world,” they’re really voicing a deeper fear — losing control. Not just of machines, but of society, power, and meaning. And that fear? It’s nothing new.
youtube · AI Governance · 2025-10-06T21:3…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        mixed
Policy           none
Emotion          mixed
Coded at         2026-04-26T23:09:12.988011
Raw LLM Response
[ {"id":"ytc_UgwF-yWMq9pfzjWYvHp4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxozHo2BJ16_JRwE7l4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"}, {"id":"ytc_UgzCYo6DAyjANDsWsHB4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"}, {"id":"ytc_UgzIhsun3bRQQVki81V4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgyDUpmWfd9YINeJd3Z4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"ytc_UgxcZbyo3mmavl5CONR4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"}, {"id":"ytc_UgyTRo_ImU4Hy3Opz6B4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"}, {"id":"ytc_Ugy5H8HwWQQCEA4hNjp4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}, {"id":"ytc_UgwV1ks_m_gHSVJgREh4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"}, {"id":"ytc_UgxdDOlAWhJTWsSHqSF4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"indifference"} ]