Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
First, IANAL, but I've taken more than my fair share of business law courses. It's my understanding that the responsibility comes in with the expected or reasonable use of the product. If a man kills his wife with a hammer, that's not the expected use. But if a man is using the hammer to do carpentry and it flies apart and kills his wife, that's when negligence and liability can come in. This is why deep in EULAs/owner's manuals you can find stuff like "don't do a terrorism with our product" or "don't wear this chainsaw as personal jewelry", so it can be established what is or is NOT part of the expected, reasonable use. If you sell an AI product and an enthusiastic sales guy says that it can answer any question for you, and the answers are wrong, very VERY wrong, that sales guy just opened the company up for liability. Would a regular person sue? Probably not. But if you are B2B, that other company has attorneys on staff and will gladly attempt to recoup losses. (Not in sales, but have had to deal with sales people who want the commission at any cost. STOP GETTING OUR COMPANY SUED!)
reddit · AI Responsibility · 1724507837.0 · ♥ 19
Coding Result
Dimension        Value
Responsibility   company
Reasoning        consequentialist
Policy           liability
Emotion          indifference

Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[ {"id":"rdc_ljpdzwq","responsibility":"company","reasoning":"consequentialist","policy":"liability","emotion":"indifference"}, {"id":"rdc_ljpt0n7","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"}, {"id":"rdc_ljrhhjv","responsibility":"company","reasoning":"virtue","policy":"unclear","emotion":"outrage"}, {"id":"rdc_ljqscjr","responsibility":"unclear","reasoning":"consequentialist","policy":"none","emotion":"approval"}, {"id":"rdc_ljsxhza","responsibility":"distributed","reasoning":"virtue","policy":"unclear","emotion":"mixed"} ]