Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
@revantair8497 "But if a computer can become sentient and than ask for rights, you should give them to him, since he is... you know... sentient." You never specified that they need be sentient in your original comment, you just said when the robot asks for a right, give it to them so that's why I answered in the way I did. But this is not so simple a thing anyway, which is my point. How do we know when the computer is sentient? It's much more simple for us to know when our animal relatives are sentient for the combination of two reasons A. we are sentient ourselves and know how it manifests in us and B. we have a similar biological makeup to our animal relatives that are sentient, and thus we can much more easily know/observe sentient in DNA-based animal lifeforms. Also, when I said 'program them that way' was not meant to be taken too literally for the real world application. It was meant more along the lines of, what if we create an AI smart enough that it can learn / try to ask for rights even if it is not sentient/conscious.
youtube AI Moral Status 2018-12-19T05:0… ♥ 1
Coding Result
Dimension        Value
Responsibility   unclear
Reasoning        mixed
Policy           unclear
Emotion          mixed
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8ovhdWgb099","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p17Uk6zn1G","responsibility":"unclear","reasoning":"mixed","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p2LV0EogqD","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"approval"},
  {"id":"ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8qB1f3R0xuj","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytr_UgzlAw5-SxzqVvryHBB4AaABAg.8ocQseAttak8pr4x8-CP2T","responsibility":"ai_itself","reasoning":"unclear","policy":"unclear","emotion":"fear"},
  {"id":"ytr_Ugy6dv4H4WoJH49oJyp4AaABAg.8o6_LvC0pnL8sZVmpbCWR0","responsibility":"ai_itself","reasoning":"deontological","policy":"liability","emotion":"approval"},
  {"id":"ytr_Ugy6dv4H4WoJH49oJyp4AaABAg.8o6_LvC0pnL8tVLOvR_Y6G","responsibility":"unclear","reasoning":"mixed","policy":"none","emotion":"mixed"},
  {"id":"ytr_Ugx-g4uTofvR0PeMO9R4AaABAg.8nzjlsd1zVm8ojy1ARNXKb","responsibility":"distributed","reasoning":"consequentialist","policy":"unclear","emotion":"outrage"},
  {"id":"ytr_UgzfCmmL1Obj07g8Nj94AaABAg.8nqyNymOL6s8pZQo_ipo86","responsibility":"unclear","reasoning":"unclear","policy":"unclear","emotion":"mixed"},
  {"id":"ytr_Ugx2df_rpyC7-9zbkWZ4AaABAg.8msCnqUOznm8o9Us075EJJ","responsibility":"ai_itself","reasoning":"deontological","policy":"none","emotion":"resignation"}
]
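The per-dimension values in the coding-result table are read out of this raw JSON array by matching on the comment's `id`. A minimal Python sketch of that lookup is below; the helper name `codes_by_id` is hypothetical, and only two entries from the raw response are embedded to keep the example self-contained.

```python
import json

# Two entries copied verbatim from the raw LLM response above.
raw_response = '''[
  {"id": "ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p17Uk6zn1G",
   "responsibility": "unclear", "reasoning": "mixed",
   "policy": "unclear", "emotion": "mixed"},
  {"id": "ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8ovhdWgb099",
   "responsibility": "ai_itself", "reasoning": "deontological",
   "policy": "none", "emotion": "approval"}
]'''

# The four coding dimensions shown in the result table.
DIMENSIONS = ("responsibility", "reasoning", "policy", "emotion")

def codes_by_id(raw: str) -> dict:
    """Index each coded comment by its id, keeping only the four dimensions."""
    return {
        entry["id"]: {dim: entry[dim] for dim in DIMENSIONS}
        for entry in json.loads(raw)
    }

codes = codes_by_id(raw_response)
print(codes["ytr_Ugx631C12qaWO8ZV5vN4AaABAg.8oilXFhy_8T8p17Uk6zn1G"])
# → {'responsibility': 'unclear', 'reasoning': 'mixed', 'policy': 'unclear', 'emotion': 'mixed'}
```

Keying on `id` rather than list position makes the lookup robust if the model returns the coded comments in a different order than they were submitted.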