Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
There is just no way that it's possible to accidently rollback too far in an organization that has top-level experts that WOULD KNOW to have every single backup option available. Creating backup's, and automatic backup's isn't hard for people working in IT. If this were to be true it's more than a monumental fuckup
Source: reddit · Dataset: AI Harm Incident · Timestamp: 1747041645.0 · ♥ 5
Coding Result
Dimension       Value
Responsibility  company
Reasoning       deontological
Policy          unclear
Emotion         outrage
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mrv267f", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "outrage"},
  {"id": "rdc_mrut4mz", "responsibility": "company", "reasoning": "unclear", "policy": "unclear", "emotion": "mixed"},
  {"id": "rdc_mru7bs2", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mrum80h", "responsibility": "unclear", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_mrvvwd5", "responsibility": "company", "reasoning": "deontological", "policy": "unclear", "emotion": "outrage"}
]
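The raw response above is a JSON array with one coding record per comment in the batch, keyed by an `id` field. A minimal sketch of how such a response could be parsed and matched back to a single coded comment is shown below; the assumption that the displayed comment corresponds to `rdc_mrvvwd5` is inferred only from its dimension values matching the table above, and is not confirmed by the source.

```python
import json

# Raw LLM response copied verbatim from the record above:
# a JSON array of per-comment coding records.
raw_response = """[
  {"id":"rdc_mrv267f","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"outrage"},
  {"id":"rdc_mrut4mz","responsibility":"company","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"rdc_mru7bs2","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mrum80h","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"rdc_mrvvwd5","responsibility":"company","reasoning":"deontological","policy":"unclear","emotion":"outrage"}
]"""

# Index the batch by comment id so each coding result can be
# looked up for display alongside its source comment.
records = {r["id"]: r for r in json.loads(raw_response)}

# Hypothetical id for the comment shown above (assumed, since its
# dimensions match the Coding Result table).
coded = records["rdc_mrvvwd5"]
```

With this indexing, `coded["reasoning"]` yields `"deontological"` and `coded["emotion"]` yields `"outrage"`, matching the Coding Result table for this comment.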