Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
The A.I systems would probably still get orders from humans. No government would risk having an A.I system that just does what it wants.
youtube 2022-03-20T18:2…
Coding Result
Dimension       Value
Responsibility  government
Reasoning       unclear
Policy          unclear
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_UgxXY1A1jB1RwuOhmTN4AaABAg.9gQBCkCK7Cd9uAjGtMeUtm", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyXvOkouEfwO_U0wvh4AaABAg.9Zln4KNMEfe9ZnzvganPAj", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA", "responsibility": "government", "reasoning": "unclear", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgyoGNtEJG_8qhOlEbB4AaABAg.9Xm91E0VQzj9nbinY_9OOr", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "resignation"},
  {"id": "ytr_UgxnMP_6vZ3HHi0jTJt4AaABAg.9Q_k7KHtBrY9Q_n9C6apOe", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwScv2WH9lqjQNscEV4AaABAg.9Q55hprlhym9Q8svJjCvBR", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyF-FYilKZnSJt-vlB4AaABAg.9Q3ZrWSSZnD9Qmc5-E14rP", "responsibility": "user", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgyobqgoESh_kyIlhrx4AaABAg.9Q-qnIpXeO19Tlk9XkcAlv", "responsibility": "user", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_UgxfJdxh82VFqzLi7j14AaABAg.9PUlJucnC4T9PUvivVwTdm", "responsibility": "company", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_UgwyuYpc9UsJCgNig1x4AaABAg.9PRXs3NgrZt9T6l1VKX94A", "responsibility": "developer", "reasoning": "deontological", "policy": "unclear", "emotion": "indifference"}
]
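A minimal sketch of how the coding result above can be recovered from the raw response: the model returns a JSON array of per-comment codings, so parsing it and indexing by comment id yields the dimension values for any given comment. This is an illustrative assumption about the lookup step, not the tool's actual implementation; only one entry from the batch is reproduced here for brevity.

```python
import json

# The raw LLM response is a JSON array; each element codes one comment.
# Only the entry matching the comment shown above is included in this sketch.
raw_response = """[
  {"id": "ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA",
   "responsibility": "government", "reasoning": "unclear",
   "policy": "unclear", "emotion": "indifference"}
]"""

# Build an id -> coding lookup so any coded comment can be inspected.
codings = {entry["id"]: entry for entry in json.loads(raw_response)}

coding = codings["ytr_UgwbV_FSiG4KcX6TVEV4AaABAg.9YHhhmDzbkU9Zo-3rckqbA"]
print(coding["responsibility"])  # government
print(coding["emotion"])         # indifference
```

The same lookup generalizes to the full ten-entry batch: since the model codes comments in batches, matching on the stable comment id is what ties a raw response element back to its row in the coding table.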