Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I think I agree with everything except one: you can define 'intelligence' in many ways, and maybe for some, 'AI' is a misnomer, at least in the current era. But I think some core property of intelligence is that it is a form of problem-solving, decision making, and learning. Artificial intelligence is then to me nothing more than the quest to automate intelligence, i.e. to automate problem-solving and decision making. When you say 'is actually a problem of all automated systems', I agree, but in this case the system is one that is built to automate high-level thinking, decision making, understanding, generalization, scientific research, coding, writing, ..., and those are the very things on which all of human industrial productivity is built. When the thing that is automated is industrialization, non-intended behaviors become bigger than what they would be in small localized automated systems built to be good at one thing. If AI is used, anywhere, then it automates, and it automates, as you said, in a way that is hard to understand. I would also say that "superintelligence" does not need to have anything to do with sentience, or the notion of intent; I don't think the issue is anthropomorphization at all, except in that we can be fooled into thinking we can trust it to think in the same terms as we do.
youtube · AI Moral Status · 2025-10-30T20:4… · ♥ 3
Coding Result
Responsibility: none
Reasoning: mixed
Policy: none
Emotion: indifference
Coded at: 2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id":"ytr_UgzczosQYWlhu4gNJCl4AaABAg.AOv-s3FOmrWAOvlatiK76M","responsibility":"company","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_Ugx3nSuDFDjpcBaDBdF4AaABAg.AOv-oK_sjhJAOv9g7gfeT1","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzUrlFSrmKEOxF9n-N4AaABAg.AOv-_TI2mTXAOv80RLGal7","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzUrlFSrmKEOxF9n-N4AaABAg.AOv-_TI2mTXAOv8HYMTXbX","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzUrlFSrmKEOxF9n-N4AaABAg.AOv-_TI2mTXAOvA2T7sLay","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzUrlFSrmKEOxF9n-N4AaABAg.AOv-_TI2mTXAOvAhqaz2Go","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_UgzUrlFSrmKEOxF9n-N4AaABAg.AOv-_TI2mTXAOvDOnLnfqS","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_Ugy-H-lkhzRZ5AlKyL94AaABAg.AOv-O0chSbaAOwasIs7OId","responsibility":"developer","reasoning":"deontological","policy":"liability","emotion":"outrage"},
  {"id":"ytr_UgwmIejXdDmc3nz1Zy54AaABAg.AOv-G3_fRc9AOwR7xdoXeU","responsibility":"government","reasoning":"contractualist","policy":"regulate","emotion":"approval"},
  {"id":"ytr_UgwYSjrR-3YQGIB4WPl4AaABAg.AOv-FgYG3tmAOv3XHim0r5","responsibility":"developer","reasoning":"deontological","policy":"regulate","emotion":"outrage"}
]
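A raw response like the one above can be parsed and sanity-checked before the per-comment codings are stored. The sketch below is a minimal, hypothetical validator: the `ALLOWED` value sets are inferred only from the values visible in this response (the project's full code book may define more), and the function name `validate_codings` is an assumption, not part of the pipeline shown here.

```python
import json

# Allowed values per dimension, inferred from the response above;
# the actual code book may permit additional values.
ALLOWED = {
    "responsibility": {"none", "company", "developer", "government", "ai_itself"},
    "reasoning": {"mixed", "deontological", "consequentialist", "contractualist"},
    "policy": {"none", "liability", "regulate"},
    "emotion": {"indifference", "mixed", "outrage", "fear", "approval"},
}

def validate_codings(raw: str) -> dict:
    """Parse a raw LLM coding response and index valid records by comment id.

    Raises ValueError if any record carries a value outside the code book,
    so malformed LLM output is caught before it reaches storage.
    """
    records = json.loads(raw)
    coded = {}
    for rec in records:
        cid = rec["id"]
        for dim, allowed in ALLOWED.items():
            if rec.get(dim) not in allowed:
                raise ValueError(f"{cid}: bad {dim!r} value {rec.get(dim)!r}")
        # Keep only the coded dimensions, keyed by comment id.
        coded[cid] = {dim: rec[dim] for dim in ALLOWED}
    return coded
```

Looking up a single comment's coding is then a plain dictionary access, e.g. `validate_codings(raw)["ytr_…"]["emotion"]`.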