Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
People have been thinking embodiment might be the key to AGI since computers were invented. I’m inclined to agree that it’s a factor. Jamming an LLM into a robot won’t change the nature of the LLM. It’s not built for that. It only does what we built it to do. We know what all its parts are and how they connect. There is nothing there that allows for consciousness. The way coding works is, you tell a thing to happen and it happens. Nothing happens if you don’t precisely tell it to happen. Until we make the conscious effort to add parts to the code which are designed to try and facilitate consciousness I’m disinclined to believe we just did it by accident.
Source: reddit · Topic: AI Moral Status · Timestamp: 1739924816.0 · ♥ 3
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-25T08:33:43.502452
Raw LLM Response
[
  {"id": "rdc_mdj1u91", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mdisb60", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_mdixx7x", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mdjir1p", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_mdj4ct7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]
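One way to sanity-check a coding result against the raw passes is a per-dimension majority vote over the five records. This is a minimal illustrative sketch, not the tool's actual reconciliation rule: the displayed result does not match a plain majority on every dimension (reasoning is coded "unclear" even though all five passes say "consequentialist"), so how the tool actually reconciles passes is not shown here.

```python
import json
from collections import Counter

# The raw LLM response shown above: five independent coding passes.
raw = """[
  {"id": "rdc_mdj1u91", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mdisb60", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "rdc_mdixx7x", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "rdc_mdjir1p", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "rdc_mdj4ct7", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]"""
records = json.loads(raw)

def majority(dimension: str) -> str:
    """Most frequent value for one coding dimension across all passes."""
    return Counter(r[dimension] for r in records).most_common(1)[0][0]

for dim in ("responsibility", "reasoning", "policy", "emotion"):
    print(dim, "->", majority(dim))
```

Running this yields none / consequentialist / none / indifference, which agrees with the displayed result only on policy and emotion, confirming that the tool uses some other (undocumented) aggregation.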