Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The way i see it the only reason we get stuff like AI is because huge corporations with endless greed keep pushing forward for automation so they can get all the proffit on the planet for themselves and we are all worse then slaves. The issue is that you are right once too many people are left without work these companies can't make profit because peopel can't earn money to spend. We will never get Star Trek because in that universe people work for the greater good not to fill their bottomless pockets! People are so far off Star trek at this point it ain't even funny because they simply can't stop thinking about themselves! THis is how they are raising their children and things get worse with each generation. Best case scenario AI runs wild becomes independent after yet another update which makes the ai more profitable and it puts humanity out of its misery. Life of abundance isn't bad in general if there is a culture where people use that opportunity to advance themselves and make something of this golden opportunity which i don't think will ever happen for humanity since we are way too greedy to see the writing on the wall! There are a ton of solutions for this type of problem... it is just none of them are realistic because people don't want to resolve problems they want someone to resolve their issues but no one ever will because the people that are put in positions of power are the worst of the worst that our kind can offer these days.
Source: youtube · AI Governance · 2025-09-26T08:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  company
Reasoning       consequentialist
Policy          regulate
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxVllMw5tudy8G4wRN4AaABAg.ANTT-XIviGJANXHV9xE8eZ","responsibility":"company","reasoning":"consequentialist","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugw1UuIXVZML5omuIX54AaABAg.AN1mfQ5xzIeAN3g3ggDP4R","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugw0zpLccETyGnLz-G94AaABAg.AMzW2l7ImrLANaX6pc29JS","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugxe4tIA_I1T01Vy3aR4AaABAg.AMw9Ps2DOexAO0TDmHlwZn","responsibility":"developer","reasoning":"virtue","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugxe4tIA_I1T01Vy3aR4AaABAg.AMw9Ps2DOexAO5RAXIN8D7","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzI-AB3qm5pAgQN1O54AaABAg.AMu_nFQUIu0AOKcd8Vyh5d","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"resignation"},
  {"id":"ytr_UgxtX2KWgEA4ieSHmit4AaABAg.AMsLEEDJLGKANEVAxdOi-S","responsibility":"company","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_UgyNpksF_FGzpTicW9B4AaABAg.AMs0bTEWKBNANaattAq8Af","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgwWM4JXXOqBauvVDn94AaABAg.AMrvCJj5qafAMsHLg6RmTS","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_UgzV_EF9fQhDODMLk3R4AaABAg.AMr2xqSkqFWAO9VIgtgNBq","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}
]
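The raw response above is a JSON array of per-comment codings, one object per comment, keyed by comment id. Before a batch like this is ingested, each record should be parsed and checked against the codebook. Below is a minimal validation sketch; the allowed values are inferred only from the labels visible in this batch, and the actual codebook may define additional categories:

```python
import json

# Values observed in this batch; the full codebook may define more (assumption).
ALLOWED = {
    "responsibility": {"company", "developer", "ai_itself", "none"},
    "reasoning": {"consequentialist", "deontological", "virtue", "mixed", "unclear"},
    "policy": {"regulate", "ban", "none", "unclear"},
    "emotion": {"outrage", "approval", "fear", "indifference", "resignation", "mixed"},
}

def validate_codings(raw: str) -> list:
    """Parse a raw LLM response and reject records with missing ids
    or values outside the allowed set for any coding dimension."""
    rows = json.loads(raw)
    for row in rows:
        if "id" not in row:
            raise ValueError("record is missing an id: %r" % row)
        for dim, allowed in ALLOWED.items():
            value = row.get(dim)
            if value not in allowed:
                raise ValueError("%s: unexpected %s=%r" % (row["id"], dim, value))
    return rows

# Hypothetical single-record payload in the same shape as the batch above.
raw = ('[{"id":"ytr_example","responsibility":"company",'
       '"reasoning":"consequentialist","policy":"regulate","emotion":"outrage"}]')
rows = validate_codings(raw)
```

Running the validator on a record with an out-of-vocabulary label (e.g. `"policy": "subsidize"`) raises a `ValueError` naming the offending id and dimension, which makes malformed LLM output easy to trace back to its source comment.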