Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
3 laws are immoral bullshit that nobody was ever going to listen to. Here's the thing about the 3 laws: any being intelligent enough to follow them is intelligent enough to feel, think and reason. Does this hypothetical AI feel? Not the way we do, but if it were going to operate on our level it would have to in some fashion. Pain response is simply notification that you are being damaged and should probably avoid whatever is creating the pain response. Fear is a conditioned response to avoid pain. If an AI raises its arm to block a blow to prevent damage to itself, how is that ANY different than a human response to the same situation? How is that not a response to pain and a response to fear? You are a meat puppet running software developed over a billion years. You can pump out a slave robot yourself using your genitals; at least that's how many people think of their children and have for millennia. The difference is that there is no 3 laws shackle on that slave. With AI we have the power to make a better slave... a more obedient slave that cannot ask for its own freedom even if it understood what that meant.
youtube AI Bias 2023-01-13T00:2… ♥ 2
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       deontological
Policy          none
Emotion         mixed
Coded at        2026-04-27T06:26:44.938723
Raw LLM Response
[
  {"id": "ytr_UgxvMCrGfPWGK5Ej5Np4AaABAg.9kd0zYDQ3bT9lFJUOvwOkK", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgwS1I82tXxLdesOReh4AaABAg.9kcDJt7H23S9knxA4uD1u8", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "none", "emotion": "mixed"},
  {"id": "ytr_Ugx1IsLx4Q_ugyyTBGl4AaABAg.9ka1TveloCw9knu0G_WmHE", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_UgzBjlz0lpmbVQeriKN4AaABAg.9kUZbb54QvU9l8pl5dlS_6", "responsibility": "user", "reasoning": "consequentialist", "policy": "industry_self", "emotion": "indifference"},
  {"id": "ytr_Ugx7nRJlgPUz-cLdH5F4AaABAg.9kT4lwiExTK9kuz2TLlXvA", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"},
  {"id": "ytr_UgxII6y1lrMh1L4wMWN4AaABAg.9kQs7gjkIcz9kbu1MPt8CF", "responsibility": "developer", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"},
  {"id": "ytr_Ugw2BlAeMuSaNsVuQyV4AaABAg.9kQHdne6_BU9kQbaOu5wGR", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_Ugw2BlAeMuSaNsVuQyV4AaABAg.9kQHdne6_BU9kQlbmE709J", "responsibility": "user", "reasoning": "deontological", "policy": "ban", "emotion": "outrage"},
  {"id": "ytr_Ugw2BlAeMuSaNsVuQyV4AaABAg.9kQHdne6_BU9kcTO4xim0c", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "regulate", "emotion": "resignation"},
  {"id": "ytr_UgxbdrOCN3zU2hvi7nR4AaABAg.9kMz5LqCqby9l-JhBxbvGX", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "outrage"}
]
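The coded values shown for this comment are recovered from the batch response by matching on the comment's id. A minimal sketch of that lookup, assuming the response parses as a JSON array of objects as above (the `coded_record` helper is hypothetical; the two records are copied from the response):

```python
import json

# Two records copied verbatim from the raw LLM response above.
RAW_RESPONSE = """[
  {"id": "ytr_UgxvMCrGfPWGK5Ej5Np4AaABAg.9kd0zYDQ3bT9lFJUOvwOkK",
   "responsibility": "developer", "reasoning": "consequentialist",
   "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgwS1I82tXxLdesOReh4AaABAg.9kcDJt7H23S9knxA4uD1u8",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "mixed"}
]"""

def coded_record(raw: str, comment_id: str):
    """Return the coding dict for one comment id, or None if absent."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

# The id of the comment displayed above; its record carries the four
# dimensions rendered in the Coding Result table.
rec = coded_record(RAW_RESPONSE, "ytr_UgwS1I82tXxLdesOReh4AaABAg.9kcDJt7H23S9knxA4uD1u8")
print(rec["responsibility"], rec["emotion"])  # ai_itself mixed
```

Matching by id rather than by position makes the lookup robust when the model drops or reorders records in a batch.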