Raw LLM Responses

Inspect the exact model output behind each coded comment.

Comment
Holding back the release of something by 10 days, because of uncertainty of how safe it might be... is itself a signal of just how out of control things are. Self driving cars have no business being on the same road with unsuspecting civilians. Regulating this has more to do with aviation safety than software sales. And yet this is where we are in 2026. I watched _The Big Short_ again the other day, about the 2008 credit collapse. And its right up there with _Titanic_ in terms of showing what bleeding edge technology looks like, just before unforseen disaster.
Source: reddit · AI Moral Status · 1773249033.0 · ♥ 4
Coding Result
Responsibility: company
Reasoning: deontological
Policy: regulate
Emotion: outrage
Coded at: 2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_o9vrlix", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_o9xnukg", "responsibility": "none", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_oa4pm9i", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "rdc_o9vw4cr", "responsibility": "company", "reasoning": "deontological", "policy": "regulate", "emotion": "outrage"},
  {"id": "rdc_o9wl2ow", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
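The raw response is a JSON array with one record per comment, keyed by `id`. A minimal sketch of extracting the coding for a single comment from such a response (the `code_for` helper name is hypothetical, not part of the tool; the ids and fields are taken from the response above):

```python
import json

# Raw LLM response: a JSON array of per-comment codes (abbreviated
# here to two of the five records shown above).
raw = '''[
  {"id": "rdc_o9vrlix", "responsibility": "none", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "rdc_o9vw4cr", "responsibility": "company", "reasoning": "deontological",
   "policy": "regulate", "emotion": "outrage"}
]'''

def code_for(comment_id, raw_json):
    """Return the coding record for one comment id, or None if absent."""
    return next((rec for rec in json.loads(raw_json) if rec["id"] == comment_id),
                None)

coding = code_for("rdc_o9vw4cr", raw)
print(coding["emotion"])  # -> outrage
```

Matching the record for `rdc_o9vw4cr` against the "Coding Result" table above is a quick consistency check that the displayed dimensions really come from this response.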