Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I'm think that if robot have consciousness they would already have the understanding of what we are talking about and they might don't care unless the robot was told to not care and destroy everything. But in that case it's like a nuclear bomb.
YouTube · AI Moral Status · 2021-02-10T12:3…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           none
Emotion          fear
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_Ugyc7iVFUnlvOXBGU6l4AaABAg", "responsibility": "developer", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx7HbBXeenVQQCcub94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxeA7-O_aOfbcsmgP14AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgzRs2CZ8ylrG3el-SB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzV4RRzZTZ5CQnumu54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugwbt_rTc6HqdFD4FDx4AaABAg", "responsibility": "distributed", "reasoning": "contractualist", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyNLABXwwMCSejWi-94AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugyi6JkMBrfnY-LLLa14AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "regulate", "emotion": "approval"},
  {"id": "ytc_UgxLDWG0S8en94uHLQZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_Ugw85IDHzW72BMbuInx4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
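The raw response is a JSON array in which each object is keyed by a comment id. A minimal sketch of how the coded dimensions for one comment can be recovered from such a response (the row shown is taken verbatim from the array above; `raw_response` and `lookup_coding` are illustrative names, not part of the actual pipeline):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coded comments.
raw_response = """
[
  {"id": "ytc_UgzV4RRzZTZ5CQnumu54AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "consequentialist",
   "policy": "none",
   "emotion": "fear"}
]
"""

def lookup_coding(raw, comment_id):
    """Parse the model's JSON array and return the row for one comment id,
    or None if the model did not code that comment."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

coding = lookup_coding(raw_response, "ytc_UgzV4RRzZTZ5CQnumu54AaABAg")
print(coding["responsibility"], coding["emotion"])  # → ai_itself fear
```

Matching on the id rather than on array position keeps the lookup robust if the model returns the rows in a different order than the comments were submitted.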