Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI pioneer here, 31 patents, lots of cleared work for gov, etc... It's not just the working class: between AI replacing jobs and AI project teams, the jobs to make more AI will go away, almost all software dev, and add robots in, and the world's economic system no longer works. At that point, we need a universal base income and then diminishing returns on company revenue and those companies' employees. This is a very difficult and complex economic problem with AI legal challenges that can't even be handled currently due to inability to prosecute based on the scalability of the AI. AI can create exponentially more violations than the legal system could even detect, making consistent prosecution undoable. This is why I believe some of the current data ownership cases have ruled in favor of the AI companies: because they couldn't consistently enforce or even detect all prosecutable offenses. Artificial intelligence software running on computers and AI running on devices, robotics, and robots will absolutely take jobs, and the places that say they won't, those will be the first to replace their humans. Even saying that AI won't replace humans is just something companies say so employees can pretend they aren't worried and the companies can pretend that they care about their people more than profit.
youtube AI Jobs 2025-10-12T16:5…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       consequentialist
Policy          liability
Emotion         fear
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id": "ytc_UgwXDKqfllvNCrW-mvt4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "indifference"},
  {"id": "ytc_UgxLtR0QiO9F6POACyB4AaABAg", "responsibility": "ai_itself",   "reasoning": "consequentialist", "policy": "regulate",  "emotion": "fear"},
  {"id": "ytc_UgxNntlCl1wc7aU_6M94AaABAg", "responsibility": "distributed", "reasoning": "mixed",            "policy": "none",      "emotion": "outrage"},
  {"id": "ytc_UgzoMy31cIvuv8zbvHd4AaABAg", "responsibility": "none",        "reasoning": "consequentialist", "policy": "none",      "emotion": "approval"},
  {"id": "ytc_UgzLROtEM4-XyqMYJj94AaABAg", "responsibility": "none",        "reasoning": "unclear",          "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_Ugy8LN0uasoxtTJ-0RV4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_Ugy8H9fAPxXYvNb0M7x4AaABAg", "responsibility": "user",        "reasoning": "virtue",           "policy": "none",      "emotion": "approval"},
  {"id": "ytc_Ugyqz3TFHgQTvL_8PtN4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyQbcQh6rWxSQU7CPJ4AaABAg", "responsibility": "developer",   "reasoning": "deontological",    "policy": "none",      "emotion": "mixed"},
  {"id": "ytc_UgzvGOptu6v4kHIt_t14AaABAg", "responsibility": "company",     "reasoning": "consequentialist", "policy": "unclear",   "emotion": "fear"}
]
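A raw response like the one above has to be parsed and schema-checked before the codings are trusted, since LLM output can drift out of the label set. Below is a minimal Python sketch of that step; the allowed values are assumed from the labels visible in this export, not from a published codebook, and the helper name `parse_codings` is hypothetical.

```python
import json

# Assumed label sets per coding dimension, inferred from this export only.
ALLOWED = {
    "responsibility": {"none", "ai_itself", "distributed", "user", "developer", "company"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"none", "regulate", "liability", "unclear"},
    "emotion": {"indifference", "fear", "outrage", "approval", "mixed"},
}

def parse_codings(raw: str) -> list[dict]:
    """Parse a raw LLM response and keep only rows whose values are in-schema."""
    rows = json.loads(raw)
    return [
        row for row in rows
        if all(row.get(dim) in values for dim, values in ALLOWED.items())
    ]

# Example: one in-schema row passes, one out-of-schema row is dropped.
raw = (
    '[{"id":"ytc_a","responsibility":"distributed","reasoning":"consequentialist",'
    '"policy":"liability","emotion":"fear"},'
    '{"id":"ytc_b","responsibility":"robot","reasoning":"mixed",'
    '"policy":"none","emotion":"fear"}]'
)
print(len(parse_codings(raw)))  # → 1
```

Rows that fail validation would typically be queued for re-coding rather than silently discarded, so coverage of the comment set stays complete.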