Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
That's actually a great question! I'd wager 10 to 20 years for a similarly tinny clumsy version. This is because what we are learning now is that intelligence is easy, autonomy is hard. And so is manual dexterity. I mean, consider this: we can build AI to figure out the structures of protein molecules but we cannot build one that can pick its nose. And this makes sense if you see how long it took us to get here. Around 2 billion years to figure out operating independently as many-celled beings, and a mere 200,000 years to go from swinging from trees to discussing consciousness on reddit. We are now deconstructing in reverse, and we seem to be very close to figuring out the intelligence part, and then tackling the much harder problem of autonomy.
Source: reddit · Dataset: AI Moral Status · Timestamp: 1663158007.0 · Score: ♥ -3
Coding Result
Dimension       Value
Responsibility  none
Reasoning       unclear
Policy          none
Emotion         approval
Coded at        2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_iogs47x", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "rdc_iofhd0a", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_ioeqetc", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "rdc_iody3dx", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "rdc_iofb92h", "responsibility": "ai_itself", "reasoning": "mixed", "policy": "none", "emotion": "mixed"}
]
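The raw response is a JSON array of per-comment codes keyed by `id`. A minimal sketch of how such a batch could be parsed and validated, in Python — the `parse_batch` helper and the `ALLOWED` value sets are illustrative assumptions drawn from the values visible above, not the tool's actual codebook:

```python
import json

# Allowed values per coding dimension (illustrative; only values seen in
# this page's output are listed -- the real codebook may define more)
ALLOWED = {
    "responsibility": {"none", "ai_itself"},
    "reasoning": {"consequentialist", "mixed", "unclear"},
    "policy": {"none"},
    "emotion": {"approval", "fear", "indifference", "resignation", "mixed"},
}

def parse_batch(raw: str) -> dict:
    """Parse a raw LLM batch response into a mapping: comment id -> codes."""
    records = json.loads(raw)
    out = {}
    for rec in records:
        codes = {dim: rec[dim] for dim in ALLOWED}
        for dim, val in codes.items():
            if val not in ALLOWED[dim]:
                raise ValueError(f"{rec['id']}: unexpected {dim}={val!r}")
        out[rec["id"]] = codes
    return out

raw = ('[{"id":"rdc_iody3dx","responsibility":"none",'
       '"reasoning":"consequentialist","policy":"none","emotion":"approval"}]')
batch = parse_batch(raw)
print(batch["rdc_iody3dx"]["emotion"])  # → approval
```

A validation step like this makes disagreements visible early — for example, the table above records `Reasoning: unclear` while the matching raw record says `consequentialist`, the kind of mismatch an id-keyed lookup lets you audit.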