Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Or maybe your idea of coexistence is a human morality and ai wouldn't have such a morality in achieving its goals
Source: YouTube · AI Governance · 2023-07-30T10:1…
Coding Result
Dimension        Value
Responsibility   ai_itself
Reasoning        consequentialist
Policy           unclear
Emotion          indifference
Coded at         2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytr_Ugwa_pEu-NGXMtxlHnV4AaABAg.9v5BalJMwKv9ww9P64yNnh", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugx5rp2nNO24Wyhx7Ax4AaABAg.9v1xxXNYY7D9y0yIbtTW9C", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytr_UgzUKGoRDmhtLT1mobN4AaABAg.9urNVOYwvOP9urbnbkmUu1", "responsibility": "user", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_UgxSbuKA4gtHjZ8f25x4AaABAg.9upGYb5_YkE9wRXvjqeEe2", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugz6anPoXkBcAS2AAkZ4AaABAg.9uIicPT3p_M9uKuGS--yLQ", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVu-u44pw2", "responsibility": "unclear", "reasoning": "consequentialist", "policy": "unclear", "emotion": "fear"},
  {"id": "ytr_Ugx89fmJqE0WW-pQROF4AaABAg.9tVokbG-Yo69tVuh2JHdZ5", "responsibility": "distributed", "reasoning": "virtue", "policy": "none", "emotion": "resignation"},
  {"id": "ytr_Ugxnh_0ObDi9fP3KkVx4AaABAg.9sqYTNFSDam9vcZfc0WONZ", "responsibility": "developer", "reasoning": "deontological", "policy": "liability", "emotion": "outrage"},
  {"id": "ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugyof9gBmUYi_Sin8jh4AaABAg.9shyiTPHsh39shzOyjaSJt", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"}
]
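To inspect a specific comment's coding, the raw batch response can be parsed and indexed by comment id. The sketch below is a minimal illustration, not part of the pipeline: the two records are copied verbatim from the response above, and the id-keyed lookup is an assumed usage pattern.

```python
import json

# Two-record excerpt of the raw LLM batch response shown above.
# Field names match the dimensions in the coding-result table.
raw = """[
  {"id": "ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "unclear", "emotion": "indifference"},
  {"id": "ytr_Ugyof9gBmUYi_Sin8jh4AaABAg.9shyiTPHsh39shzOyjaSJt",
   "responsibility": "ai_itself", "reasoning": "consequentialist",
   "policy": "regulate", "emotion": "fear"}
]"""

# Parse the JSON array and build an id -> record index.
records = json.loads(raw)
by_id = {r["id"]: r for r in records}

# Look up the coding for the comment displayed on this page.
coding = by_id["ytr_Ugy1GC0FuB6IksRJiAx4AaABAg.9smmNqIQPG89smqX-LAUjk"]
print(coding["responsibility"], coding["emotion"])  # ai_itself indifference
```

The printed values match the coding-result table above, which is how a mismatch between the displayed coding and the raw model output would be caught.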