Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
> but that is an insane level risk for a company to take on.

Is it, though? Because it's the same amount of risk that my $250k limit auto liability insurance covers me for when I drive. For a multi-billion dollar car company, needing to do the occasional payout when an autonomous car causes damage, injury, or death really shouldn't be *that* much of an issue. Unless the company is already on the verge of bankruptcy (and as long as the issues don't happen *too* often), they should be fine, even in the worst case scenario.

The real risk they're eager to avoid is the risk to their PR. If there's a high profile case of their autonomous vehicle killing or seriously injuring someone "important", it could cause them to lose a *much* larger amount of money through lost sales due to consumers viewing their cars as 'too dangerous'.
reddit AI Responsibility 1755598492.0 ♥ 22
Coding Result
| Dimension      | Value                      |
| -------------- | -------------------------- |
| Responsibility | company                    |
| Reasoning      | utilitarian                |
| Policy         | liability                  |
| Emotion        | indifference               |
| Coded at       | 2026-04-25T08:33:43.502452 |
Raw LLM Response
```json
[
  {"id": "rdc_n9i5c43", "responsibility": "company",     "reasoning": "consequentialist", "policy": "liability",     "emotion": "indifference"},
  {"id": "rdc_n9ie952", "responsibility": "distributed", "reasoning": "deontological",    "policy": "unclear",       "emotion": "fear"},
  {"id": "rdc_n9hnanf", "responsibility": "company",     "reasoning": "deontological",    "policy": "regulate",      "emotion": "fear"},
  {"id": "rdc_n9hftrt", "responsibility": "distributed", "reasoning": "deontological",    "policy": "liability",     "emotion": "approval"},
  {"id": "rdc_n9hzids", "responsibility": "company",     "reasoning": "consequentialist", "policy": "industry_self", "emotion": "resignation"}
]
```
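To inspect the raw output for a single coded comment, the batch response can be parsed and indexed by comment id. A minimal sketch, assuming the raw response is a JSON array of objects with the fields shown above (the `raw` string here is a truncated, hypothetical copy of the record):

```python
import json

# Hypothetical excerpt of a raw LLM response: a JSON array of per-comment
# codings, each keyed by a comment id.
raw = '''[
  {"id": "rdc_n9i5c43", "responsibility": "company",
   "reasoning": "consequentialist", "policy": "liability",
   "emotion": "indifference"},
  {"id": "rdc_n9ie952", "responsibility": "distributed",
   "reasoning": "deontological", "policy": "unclear", "emotion": "fear"}
]'''

# Index the batch by comment id for direct lookup.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Pull the exact model output for one comment.
print(codings["rdc_n9i5c43"]["reasoning"])  # consequentialist
```

Indexing by id makes it easy to compare what the model actually returned against the coding stored in the result table for the same comment.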