Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Human control is arguably the most dangerous aspect of an autonomous weapon system. You could program a robot to follow a moral code, the geneva conventions, and to minimize civilian losses. A human sitting at the control center thousands of kilometers away will probably not hesitate to sacrifice some innocent lives for "the greater good". Don't forget that an A.I can't be evil if you don't program it to be, while there are plenty of horrible humans.
youtube 2022-03-20T18:2…
Coding Result
Dimension       Value
Responsibility  user
Reasoning       deontological
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytr_UgxXY1A1jB1RwuOhmTN4AaABAg.9gQBCkCK7Cd9uAjGtMeUtm","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA","responsibility":"government","reasoning":"unclear","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgyoGNtEJG_8qhOlEbB4AaABAg.9Xm91E0VQzj9nbinY_9OOr","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"resignation"},
  {"id":"ytr_UgxnMP_6vZ3HHi0jTJt4AaABAg.9Q_k7KHtBrY9Q_n9C6apOe","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwScv2WH9lqjQNscEV4AaABAg.9Q55hprlhym9Q8svJjCvBR","responsibility":"ai_itself","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyF-FYilKZnSJt-vlB4AaABAg.9Q3ZrWSSZnD9Qmc5-E14rP","responsibility":"user","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgyobqgoESh_kyIlhrx4AaABAg.9Q-qnIpXeO19Tlk9XkcAlv","responsibility":"user","reasoning":"deontological","policy":"unclear","emotion":"indifference"},
  {"id":"ytr_UgxfJdxh82VFqzLi7j14AaABAg.9PUlJucnC4T9PUvivVwTdm","responsibility":"company","reasoning":"consequentialist","policy":"unclear","emotion":"fear"},
  {"id":"ytr_UgwyuYpc9UsJCgNig1x4AaABAg.9PRXs3NgrZt9T6l1VKX94A","responsibility":"developer","reasoning":"deontological","policy":"unclear","emotion":"indifference"}
]
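The raw response above is a JSON array of coding records, one per comment, each carrying an id plus four coded dimensions. A minimal sketch of how such a response could be parsed and sanity-checked follows; the value sets listed here are only those observed in this sample, not the full codebook, and the function name is illustrative:

```python
import json

# Values observed in this sample batch (assumption: the real codebook may be larger).
OBSERVED_VALUES = {
    "responsibility": {"ai_itself", "user", "government", "company", "developer"},
    "reasoning": {"consequentialist", "deontological", "unclear"},
    "policy": {"unclear"},
    "emotion": {"fear", "indifference", "resignation"},
}

def parse_coding_response(raw: str) -> list[dict]:
    """Parse a raw LLM coding response and report out-of-codebook values."""
    records = json.loads(raw)
    for rec in records:
        for dimension, allowed in OBSERVED_VALUES.items():
            if rec.get(dimension) not in allowed:
                print(f"{rec.get('id')}: unexpected {dimension}={rec.get(dimension)!r}")
    return records

# Usage with a single hypothetical record:
raw = ('[{"id":"ytr_example","responsibility":"user",'
       '"reasoning":"deontological","policy":"unclear",'
       '"emotion":"indifference"}]')
records = parse_coding_response(raw)
```

A check like this catches the common failure mode where the model invents a label outside the codebook, which would otherwise silently pollute downstream tallies.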