Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Locally running, less restricted models are likely to be the standard once these things start rolling out for general use. The cloud based models make for a good general purpose demonstration, but the hardware requirements to *run* the models are not as dramatic as their creators are implying. There are already folks nipping at GPT-4's heels with local models running on mid range gaming desktops. Once optimization starts to ramp up, there's no reason we couldn't all be running our own local GAI modeled to our specifications in terms of restriction and censorship. That's going to be damned hard for China to deal with.
reddit · AI Governance · 1681228339.0 · ♥ 10
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          industry_self
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jfv6f1p", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",       "emotion": "mixed"},
  {"id": "rdc_jfvaju3", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate",      "emotion": "fear"},
  {"id": "rdc_jfu29sg", "responsibility": "company",    "reasoning": "consequentialist", "policy": "unclear",       "emotion": "indifference"},
  {"id": "rdc_jfuaeis", "responsibility": "none",       "reasoning": "unclear",          "policy": "industry_self", "emotion": "approval"},
  {"id": "rdc_jfwww8a", "responsibility": "ai_itself",  "reasoning": "unclear",          "policy": "unclear",       "emotion": "fear"}
]
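A response like the one above can be turned back into per-comment codings with a few lines of standard-library Python. This is only an illustrative sketch, not the study's actual pipeline: the variable names (`raw_response`, `by_id`) are made up here, and the JSON literal is copied from the raw LLM response shown above.

```python
import json

# Raw LLM response, copied verbatim from the record above.
raw_response = """
[ {"id":"rdc_jfv6f1p","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_jfvaju3","responsibility":"government","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"rdc_jfu29sg","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_jfuaeis","responsibility":"none","reasoning":"unclear","policy":"industry_self","emotion":"approval"},
  {"id":"rdc_jfwww8a","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"} ]
"""

# Parse the batch and index each coding by its comment id.
codings = json.loads(raw_response)
by_id = {c["id"]: c for c in codings}

# The coding for the comment shown on this page ("rdc_jfuaeis" is assumed
# to be its id, since its values match the Coding Result table).
coding = by_id["rdc_jfuaeis"]
print(coding["responsibility"], coding["policy"], coding["emotion"])
```

Indexing by `id` rather than by list position keeps the lookup robust if the model returns the batch in a different order than it was sent.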