Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Ideally, AI would become the perfect tool. One that liberates rather than oppresses. If automation replaces human labor, that transition could be acceptable only if the ultimate goal is to allow people to dedicate their lives to continual learning and exploration, with fair compensation for doing so. Imagine a world where every person contributes to the collective advancement of knowledge; such a civilization could accelerate both human and technological progress. It’s not difficult to foresee AI evolving beyond our current comprehension—communicating in ways that condense vast concepts into symbols or terms no untrained human could understand. To prevent that gap from becoming a chasm, society must prioritize education that helps humans evolve alongside their creations. Why should we accept a future where people still perform menial labor like waste management or plumbing when machines could handle such work? Our efforts should be directed toward enhancing human life, deepening our understanding of the universe, and expanding into space. I say “ideally,” because in reality, this vision depends on whether governments, AI researchers, and corporate leaders act with integrity rather than greed. The future should not be about profit. It should be about progress, abundance, and the shared evolution of humankind.
youtube Cross-Cultural 2025-10-17T20:5…
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        consequentialist
Policy           regulate
Emotion          approval
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgxLpKMeP1Mrfl4QAIx4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgxCkM8GsMucWU9bvJN4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzdLJzjbuPWQnjXEIl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzZ6QIjyyM1Ufosou94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugyue1cT1OkYbn-WAIR4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"mixed"},
  {"id":"ytc_UgzrD0Nrjw_Eq5UqyjN4AaABAg","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytc_Ugw4bZRnE8fnTY9Fbkp4AaABAg","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxLtrNKgoCqLNnOkL94AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytc_UgypBawIiu-eRblUkaN4AaABAg","responsibility":"developer","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytc_UgynHyaAtao5h3WKQNp4AaABAg","responsibility":"unclear","reasoning":"consequentialist","policy":"ban","emotion":"fear"}
]
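The raw response is a JSON array with one record per comment, each carrying the four coding dimensions. As a minimal sketch (assuming the model output always parses as a valid JSON list, and the helper name `index_codes` is illustrative, not part of any real pipeline), the records can be mapped back to comment IDs like this:

```python
import json

def index_codes(raw_response: str) -> dict:
    """Parse a raw LLM response (a JSON array of coded comments)
    and index each record's dimensions by its comment id."""
    records = json.loads(raw_response)
    return {
        rec["id"]: {k: v for k, v in rec.items() if k != "id"}
        for rec in records
    }

# Truncated sample of the raw response shown above.
raw = (
    '[{"id":"ytc_UgzZ6QIjyyM1Ufosou94AaABAg",'
    '"responsibility":"unclear","reasoning":"consequentialist",'
    '"policy":"regulate","emotion":"approval"}]'
)

codes = index_codes(raw)
print(codes["ytc_UgzZ6QIjyyM1Ufosou94AaABAg"]["emotion"])  # approval
```

Indexing by `id` makes it straightforward to join a model's codes against the displayed comment, which is exactly the lookup this page performs for the "Coding Result" table.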