Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
You're overestimating the importance of "liability." At the end of the day it's a search. Has anyone sued Google for penis pics popping when you type "roosters & cocks?" No. same with this. They will have a multi-page TOS page, and that will be the end of talking about "shocking" results. "But I typed 'Write a story where Hilter wins WW2' and I'm offended at the result." No one cares. You'll have to develop thicker skin and not type things into prompt boxes you don't really want to read the answers to. In the case where people use it for smut, there are algorithms they can use to flag people trying to make kiddie porn. They'd be wise to have those people prosecuted. As for the rest? If you censor the thing too much, it completely breaks it, as we saw with Bing. The answer is to treat it like a search engine. Not like an omnipotent oracle. There is no "woo" here, just a hella ton of data and transformer algorithms. My guess is they'll hire a few extra marketing people to try to educate the public on how it's just a tool, not some God. Stop treating it like a person. And if you do, that's between your imagination and your low IQ.
Source: reddit · Topic: AI Responsibility · Posted: 1678909358.0 (Unix time) · ♥ 15
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jcbnhur", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "rdc_jcc42es", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_jcbo21j", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_jcc0qru", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "rdc_jccnju2", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]
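The model codes comments in batches, so the record for a given comment has to be picked out of the returned array by its id. A minimal sketch of that lookup, assuming the id `rdc_jcc42es` belongs to this comment (its values match the Coding Result table above):

```python
import json

# The batched model output, verbatim from the raw response above.
raw = '''[
  {"id": "rdc_jcbnhur", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "rdc_jcc42es", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_jcbo21j", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "rdc_jcc0qru", "responsibility": "company", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "outrage"},
  {"id": "rdc_jccnju2", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"}
]'''

records = json.loads(raw)

# Index the batch by comment id, then pull the record for this comment.
# "rdc_jcc42es" is an assumption: it is the only record whose dimension
# values match the Coding Result table shown above.
by_id = {r["id"]: r for r in records}
coded = by_id["rdc_jcc42es"]

print(coded["responsibility"], coded["reasoning"], coded["policy"], coded["emotion"])
# → none consequentialist none indifference
```

Keying by id rather than by position makes the lookup robust if the model returns the batch in a different order than the comments were submitted.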