Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
We don't need to imagine what the future might be like. It's here now being created and for humans we have an increasingly divided society with growing inequality. Humans have not evolved emotionally much beyond apes but we have created incredibly sophisticated technology which is now evolving super fast. We know from past and current human behaviour we aren't able to stop things because of fear someone else won't. We know that while we can see some immediate threats we struggle to see systemic threats. We also know that when we try to get round something we create another set of problems. So for example we hope to stop climate change, we imagine having renewable energy but in fact we create things that require yet more energy. There are many examples we can examine but for the most part the indications are not good. It is possible that humans may willingly allow Super AI to take over because we see now that humans allow this to happen in social media with basic AI. I see two possible futures, one is very concerning, the other requires a complete re-think.
youtube Cross-Cultural 2025-10-06T15:1…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  distributed
Reasoning       consequentialist
Policy          unclear
Emotion         fear
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgyiZf0_fDK7Wny5vjp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwgtjJKXja-l0DmO9p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgyO3Dzg-4VmNDqhtaF4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"outrage"},
  {"id":"ytc_UgyybmZU9NtAGGyrw8t4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxdrmOPk9Z6xEJzlpF4AaABAg","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgzuJQvbMKIZ5EuopjR4AaABAg","responsibility":"developer","reasoning":"mixed","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugx30SqMLdMF5CnjWnl4AaABAg","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzIegTqNvrcwjHbns94AaABAg","responsibility":"user","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytc_Ugzl-TE7uwiWEa9OYP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugze4OW7_to-EYlT1wR4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"regulate","emotion":"approval"}
]
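The raw response is a JSON array with one object per comment, each carrying the four coded dimensions (responsibility, reasoning, policy, emotion) keyed by comment id. A minimal sketch of how such an output could be parsed to recover the coding for a single comment; the `lookup` helper and the truncation of the array to three entries are illustrative, not part of the original pipeline:

```python
import json

# Raw LLM response in the format shown above (truncated to three
# entries here for brevity); each object codes one YouTube comment.
raw_response = '''
[ {"id":"ytc_UgyiZf0_fDK7Wny5vjp4AaABAg","responsibility":"user","reasoning":"consequentialist","policy":"none","emotion":"outrage"},
  {"id":"ytc_UgwgtjJKXja-l0DmO9p4AaABAg","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"fear"},
  {"id":"ytc_Ugzl-TE7uwiWEa9OYP94AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"fear"} ]
'''

def lookup(raw: str, comment_id: str) -> dict:
    """Parse a raw LLM response and return the coding for one comment id."""
    return next(c for c in json.loads(raw) if c["id"] == comment_id)

coding = lookup(raw_response, "ytc_Ugzl-TE7uwiWEa9OYP94AaABAg")
print(coding["responsibility"], coding["emotion"])  # distributed fear
```

The last entry matches the Coding Result table above: responsibility "distributed", reasoning "consequentialist", policy "unclear", emotion "fear".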