Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
AI uses too many resources. We are building a human being substitute, that could take over the world. We, as humans, may lose the ability to create. To know what's good for us. If machines start to decide, chocolate isn't good for humans, we lose chocolate. And the discussion didn't talk about what might be a turn on to AI. Or if AI disagreed with itself. I believe it will just be like a human mind, and maybe have wars, love, religion, etc. The knowledge it acquires will just be compartmentalized. It may even get lazy, not producing, or giving wrong answers as discussed. So yeah, in the beginning it may somewhat help, BUSINESS. But in the end will likely take control. The so called rich, won't be enjoying their lives, I guarantee this. And I'm not quite sure what will happen to the average human. We live for creativity and for someone to appreciate our skills. And what will we do with all this spare time and not getting paid. I believe we may create separate societies, and may find our enemy to be AI. And AI will likely control us and itself with no human influence. Androids could surely walk the beach, pick up a shell and analyze it. And AI's database has our whole history records. And it knows how to duplicate itself. So for the future of humanity, this needs to end. What will be the difference to AI's machine, between us and mosquitoes?
youtube AI Moral Status 2026-03-21T20:0…
Coding Result
Dimension        Value
Responsibility   developer
Reasoning        consequentialist
Policy           liability
Emotion          fear
Coded at         2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id":"ytc_UgyekrAhoz83uiVir914AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxMUAtRH0WfnLVadUV4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwdWTN_5iVgEoUOARF4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UgxoJ-_oczNp5I_5QwR4AaABAg","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgzuYkjJGlIp93PCIid4AaABAg","responsibility":"none","reasoning":"none","policy":"none","emotion":"outrage"},
  {"id":"ytc_Ugxv-l_zW8s9Ud7qppB4AaABAg","responsibility":"distributed","reasoning":"mixed","policy":"regulate","emotion":"fear"},
  {"id":"ytc_Ugy3iWUKupSeuxqoWz14AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_Ugw669t6F0hCNsbQun94AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_UgwOWbQ761cB-4YDbEp4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugy4lwgNXJiv59YhwfN4AaABAg","responsibility":"developer","reasoning":"contractualist","policy":"regulate","emotion":"approval"}
]
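To inspect a single comment's coding from a raw response like the one above, the JSON array can be parsed and filtered by comment id. A minimal sketch in Python; the `coding_for` helper name is hypothetical, and the `raw_response` below is shortened to two records copied from this page. It assumes the model returned valid JSON (a real pipeline would also handle parse failures).

```python
import json

# Shortened raw LLM response: a JSON array of per-comment coding records
# (two illustrative entries copied from the response above).
raw_response = '''
[
  {"id": "ytc_Ugw669t6F0hCNsbQun94AaABAg", "responsibility": "developer",
   "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyekrAhoz83uiVir914AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "approval"}
]
'''

def coding_for(comment_id: str, response_text: str):
    """Return the coding record for one comment id, or None if absent."""
    records = json.loads(response_text)
    return next((r for r in records if r["id"] == comment_id), None)

record = coding_for("ytc_Ugw669t6F0hCNsbQun94AaABAg", raw_response)
print(record["responsibility"], record["emotion"])  # -> developer fear
```

The lookup matches the Coding Result table above: the record for this comment carries responsibility "developer", policy "liability", and emotion "fear".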