Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@guncolony Problem with that is that OpenAI is basically Microsoft now, they're so deep in bed with Microsoft that they wouldn't be able to pull their themselves out even if they wanted to. Maybe the lawsuit against them could help loosen things up a bit, but with how capitalist the US is I highly doubt their legal system is going to rule in favor of non-profit and open source, the general American mentality is that those things are communism and bad by default, and surely corporate libertarian (for you Americans that have things twisted up, that's the conservative right wing) interests have huge sway in the US court. They're even buying up members of the Supreme Court, like that Clarence Thomas guy. Regular courts, they probably "own" most of them. So I wouldn't bet on that court case going our way, and even if it did the actual real world outcome is hard to predict, it might do nothing to release OpenAI from the chains Microsoft have on them. According to the agreement Microsoft no longer has any rights to the tech after AGI is achieved, but AGI isn't even legally defined so that's a bit of a grey area that this court case will hopefully have to take a stance on.
youtube AI Governance 2024-03-17T00:2…
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          liability
Emotion         outrage
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugxhm-p7rDbeJGonM3h4AaABAg.A13G5Lpm0ADA13gfmWd8My","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugxhm-p7rDbeJGonM3h4AaABAg.A13G5Lpm0ADA14vzCfaopw","responsibility":"government","reasoning":"deontological","policy":"regulate","emotion":"outrage"},
  {"id":"ytr_Ugxhm-p7rDbeJGonM3h4AaABAg.A13G5Lpm0ADA14z_wRoAj9","responsibility":"distributed","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugxi5ce8EMCD9FE3_yR4AaABAg.A13Ez6-PLyqA13WOLhlLWe","responsibility":"none","reasoning":"contractualist","policy":"industry_self","emotion":"approval"},
  {"id":"ytr_Ugxi5ce8EMCD9FE3_yR4AaABAg.A13Ez6-PLyqA17KyQnbW2P","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugwf45m2Q90YUSuxErJ4AaABAg.A13EcDzGajyA16vt7XpPEt","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgyHe-33IprUCbxgqwN4AaABAg.A13E1yxI70dA1Gd_bBpr4m","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgzoEE0Nmcw2-gyYSzV4AaABAg.A136JkmRALMA16Q1pb8MRZ","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugw1xsTprIl6aeaT5ft4AaABAg.A135mYaeu95A13ZLuH9Mob","responsibility":"company","reasoning":"virtue","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugw1xsTprIl6aeaT5ft4AaABAg.A135mYaeu95A13bAY3-G1c","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}
]
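A raw response in this shape can be matched back to a coded comment by indexing the JSON array on its `id` field. Below is a minimal sketch in Python, assuming the array parses as valid JSON and every entry carries the five keys shown above (`id`, `responsibility`, `reasoning`, `policy`, `emotion`); the abbreviated `raw` string reuses one entry from the response for illustration.

```python
import json

# One entry copied from the raw LLM response above, abbreviated for illustration.
raw = """[
  {"id": "ytr_Ugw1xsTprIl6aeaT5ft4AaABAg.A135mYaeu95A13bAY3-G1c",
   "responsibility": "company", "reasoning": "deontological",
   "policy": "liability", "emotion": "outrage"}
]"""

# Build an id -> coding lookup so any comment's dimensions can be retrieved.
codings = {row["id"]: row for row in json.loads(raw)}

# Retrieve the coding for the comment shown in the "Coding Result" table.
row = codings["ytr_Ugw1xsTprIl6aeaT5ft4AaABAg.A135mYaeu95A13bAY3-G1c"]
print(row["responsibility"], row["policy"], row["emotion"])  # company liability outrage
```

The printed values match the Dimension/Value table above, which is the check this view is meant to support: the table should always be reproducible from the raw response.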