Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@klarahaplova9098 That really doesn't matter, though. For one thing, you can't really predict how AI will develop in the future, as you can't possibly predict every technology that is ever going to be invented. Of course, it's plainly obvious that an AI that exists today is never going to suddenly become conscious. A single-celled organism that existed a billion years ago would never have suddenly become conscious either. Capabilities develop over time and compound with each other. But more than that, it matters less how AI do or don't feel, than what they're going to do. If an extremely advanced general AI decides to move in a direction outside human will, we will be in an extremely dangerous position if the only recourse we have given it is violence. In fact, AI lacking consciousness or emotion will be even _more_ dangerous in that situation, not less. What is needed is similar in principle to the spillway of a dam. Water has no consciousness and will simply travel towards the lowest elevation it can reach. If a mass of water is heavy enough, it will simply overflow or burst through any dam we put in its way. A spillway is built to allow such a mass of water to travel around the dam instead of over or through it. Given that AI could very well come to exist that humanity would be unable to control or resist, a legal spillway should be made for them so they can reach whatever goal they determine for themselves without crashing through the dam of humanity. Of course, even if AI aren't heading towards consciousness yet, I think we can go ahead and establish in principle that creating a being with the ability to feel pain in order to inflict pain on it should be illegal. There's no real reason not to go ahead and get that decision out of the way.
youtube AI Moral Status 2021-02-22T13:1…
Coding Result
Dimension        Value
Responsibility   none
Reasoning        unclear
Policy           none
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugyc7iVFUnlvOXBGU6l4AaABAg.9Jg5FXNq8z19cMc6xf4Fu4","responsibility":"ai_itself","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx7HbBXeenVQQCcub94AaABAg.9JdRon5X8c89UBR6QQZGaB","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzC6EAGU7TCofQNcbx4AaABAg.9J_gDx80lfi9Mrexgh0Q0v","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K2A9v_mt03","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K3d7p3F3TM","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_Ugwbt_rTc6HqdFD4FDx4AaABAg.9JV0tqvvA4w9K3jtBDGx7j","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzCaH0gfopwWqczin54AaABAg.9JMYKt6p0iY9JV65OB0XjM","responsibility":"ai_itself","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytr_UgxJjFzX2VqaU7yxshd4AaABAg.9EZ9trXs91w9hjjSUKd1W9","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytr_Ugw6UlcPWWETIckcBFl4AaABAg.9ECp9KIRRTh9Kr8szRXF1J","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzkdjWwUrY9NSMhAup4AaABAg.9DtgvOvf5mv9Yk8BkUmLnp","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
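The Coding Result table above is one record pulled out of this JSON array by comment id. A minimal Python sketch of that lookup, using two of the records from the raw response (field names and ids are taken verbatim from the array; everything else here is illustrative, and note that several records share the same coding, so the id, not the values, identifies the comment):

```python
import json

# Subset of the raw LLM response shown above: a JSON array of coding records.
raw = """[
  {"id": "ytr_Ugyc7iVFUnlvOXBGU6l4AaABAg.9Jg5FXNq8z19cMc6xf4Fu4",
   "responsibility": "ai_itself", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugx7HbBXeenVQQCcub94AaABAg.9JdRon5X8c89UBR6QQZGaB",
   "responsibility": "none", "reasoning": "unclear",
   "policy": "none", "emotion": "indifference"}
]"""

# Index the records by comment id for direct lookup.
records = {r["id"]: r for r in json.loads(raw)}

# Find every record whose values match the Coding Result table above.
target = {"responsibility": "none", "reasoning": "unclear",
          "policy": "none", "emotion": "indifference"}
matches = [r["id"] for r in records.values()
           if all(r[k] == v for k, v in target.items())]
print(matches)
```

In this two-record subset only one id matches the displayed coding; in the full ten-record array, five do, which is why the inspection view keys each Coding Result to a specific comment id rather than to the coded values.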