Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI’s greatest risk has already been realized. It dehumanized our societies. In what world has a species ever collectively accepted that a technology will destroy the need for they themselves and embraced it on the basis of its ability to create wealth for a limited few in favour of social stability? If someone handed you a tool and said, “this will replace you, and most everyone you know, but the creator of it will be rich,” wouldn’t you hesitate and ask whose value proposition you were to working toward? But our entire society is hell bent on allowing the creators of these systems to leverage the entire intellectual property of humanity without consequence. And we have failed to defend against that, choosing the wealth generation of those few over the health of the society. And it isn’t capitalism that did this, it is a warped view of what society and community mean. Thankfully by the time this disaster unfolds completely, I will be safely back in the dirt. But what a sad end to a species that had the opportunity to do better. Now let’s all go dance on whatever social platform one prefers. That’ll solve the problem.
youtube AI Governance 2025-07-15T10:5…
Coding Result
Dimension        Value
Responsibility   distributed
Reasoning        deontological
Policy           regulate
Emotion          outrage
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_UgxlFJpzhXHtts4scKd4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugxaw5XP3BL58kXLlYd4AaABAg","responsibility":"distributed","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytc_UgyW3ITkQsAhm5QOJ8x4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgzbX7TW-r0dU5G98H54AaABAg","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytc_UgxVJ4AlQyaUbBkn4k94AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugz2Zmyclr68FS2W8Jl4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"industry_self","emotion":"approval"},
  {"id":"ytc_UgxENRM0D0X2DYXElP94AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxmvn-cWnxpbagrkcp4AaABAg","responsibility":"distributed","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxRReKHZYWWhs-ml7J4AaABAg","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"approval"},
  {"id":"ytc_Ugwh6hNpfJxlx6ZaAWN4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"liability","emotion":"outrage"}
]
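The raw response above is a JSON array, one coding object per comment. A minimal sketch of how such a batch response could be parsed and matched back to a single comment id (the `coding_for` helper is hypothetical, not part of the tool shown here; the short `raw_response` sample reuses one object from the array above):

```python
import json
from typing import Optional

# Sample raw LLM response: a JSON array of per-comment coding objects,
# mirroring one entry from the response shown above.
raw_response = (
    '[{"id":"ytc_Ugxaw5XP3BL58kXLlYd4AaABAg",'
    '"responsibility":"distributed","reasoning":"deontological",'
    '"policy":"regulate","emotion":"outrage"}]'
)

def coding_for(raw: str, comment_id: str) -> Optional[dict]:
    """Parse a raw batch response and return the coding dict for one comment id."""
    for item in json.loads(raw):
        if item.get("id") == comment_id:
            return item
    return None

coding = coding_for(raw_response, "ytc_Ugxaw5XP3BL58kXLlYd4AaABAg")
print(coding["emotion"])  # → outrage
```

Looking codes up by `id` like this is what lets each coded dimension (responsibility, reasoning, policy, emotion) in the result table be traced back to the exact object the model emitted.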