Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
A few studies have been made on "how likely is it that an extinction event hits humanity by 2100", as in, "the end of civilisation as we know it"; researchers estimate nuclear risk at 1%, and AI, or man-made artificial intelligence, at around 10%. So yeah pretty big risk indeed
Source: reddit · Dataset: AI Moral Status · Timestamp: 1685604675.0 · ♥ 46
Coding Result
Dimension       Value
Responsibility  none
Reasoning       utilitarian
Policy          none
Emotion         fear
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_jmg61cj", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_jmhlqd9", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_jmfwxpl", "responsibility": "none", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_jmhboqz", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "mixed"},
  {"id": "rdc_jmfqo2q", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "fear"}
]
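A raw response in this shape can be parsed back into per-comment coding records with a few lines of Python. This is a minimal sketch, not part of the pipeline itself; the field names come from the JSON shown above, and the variable names (`raw`, `by_id`) are illustrative:

```python
import json

# Raw LLM response in the format shown above (two of the five records,
# for brevity; field names match the JSON in this section).
raw = '''[
  {"id": "rdc_jmg61cj", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_jmhlqd9", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]'''

# Parse the JSON array and index the records by comment id,
# so any coded comment can be inspected directly.
records = json.loads(raw)
by_id = {rec["id"]: rec for rec in records}

print(by_id["rdc_jmg61cj"]["emotion"])  # fear
```

Indexing by `id` makes it easy to cross-check a single comment's coded dimensions against the rendered table above.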