Raw LLM Responses

Inspect the exact model output for each coded comment.

Comment
It's inevitable that human civilization will eventually end even if you think it may take until the heat death of the universe though that's unrealistic... I would rather end with AI going forward then say something like a nuclear war... Or asteroid strike.
Source: reddit · Topic: AI Governance · Timestamp: 1739122131.0 · Score: -6
Coding Result
Dimension        Value
Responsibility   none
Reasoning        consequentialist
Policy           none
Emotion          resignation
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[ {"id":"rdc_o7ezc7s","responsibility":"company","reasoning":"virtue","policy":"liability","emotion":"outrage"}, {"id":"rdc_oi23z9w","responsibility":"company","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"}, {"id":"rdc_mbum3mz","responsibility":"company","reasoning":"virtue","policy":"regulate","emotion":"outrage"}, {"id":"rdc_mbv9j0p","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"resignation"}, {"id":"rdc_mbwe7jl","responsibility":"government","reasoning":"consequentialist","policy":"none","emotion":"fear"} ]