Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
Killing others requires an emotional motive. Computers don't lust for control because they don't have the human frailties of pride, arrogance, greed. If AI is ever programmed with these human frailties, then I would worry.
youtube AI Governance 2025-06-25T10:0…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       virtue
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id":"ytc_Ugz_ybgufAgnJUsJET94AaABAg","responsibility":"ai_itself","reasoning":"virtue","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgyLS8_5xfZCQcIc-CV4AaABAg","responsibility":"developer","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxWl3LU_ffv37wzbbl4AaABAg","responsibility":"none","reasoning":"mixed","policy":"none","emotion":"approval"},
  {"id":"ytc_UgyDZIMb9GLD4ojZlbl4AaABAg","responsibility":"developer","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwdKano-bUypol0tCN4AaABAg","responsibility":"distributed","reasoning":"consequentialist","policy":"regulate","emotion":"fear"},
  {"id":"ytc_UgxSoqudLLkTzV3L0WV4AaABAg","responsibility":"none","reasoning":"unclear","policy":"none","emotion":"approval"},
  {"id":"ytc_UgxDVk7UCNXHu8zbLGx4AaABAg","responsibility":"ai_itself","reasoning":"mixed","policy":"none","emotion":"fear"},
  {"id":"ytc_UgyzK-Cz9MXYz7aXdpV4AaABAg","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgwBrY80RNrkaehqgXx4AaABAg","responsibility":"developer","reasoning":"virtue","policy":"ban","emotion":"outrage"},
  {"id":"ytc_UgzR9MPvUZzYT8z9V114AaABAg","responsibility":"company","reasoning":"unclear","policy":"none","emotion":"mixed"}
]
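A raw response like the one above can be checked programmatically before its codings are accepted. The sketch below (a minimal illustration, assuming Python; `index_codings` is a hypothetical helper, not part of this tool) parses the JSON array, drops any entry missing one of the four coding dimensions, and indexes the rest by comment id so a single comment's coding can be looked up:

```python
import json

# Truncated sample of the raw LLM response shown above (first entry only).
RAW_RESPONSE = """
[
  {"id": "ytc_Ugz_ybgufAgnJUsJET94AaABAg",
   "responsibility": "ai_itself",
   "reasoning": "virtue",
   "policy": "none",
   "emotion": "indifference"}
]
"""

# Every entry must carry the comment id plus all four coding dimensions.
REQUIRED_KEYS = {"id", "responsibility", "reasoning", "policy", "emotion"}


def index_codings(raw: str) -> dict:
    """Parse a raw LLM response and index valid codings by comment id.

    Entries missing any required key are silently skipped, so malformed
    rows in the model output do not poison the lookup.
    """
    rows = json.loads(raw)
    return {
        row["id"]: {k: row[k] for k in REQUIRED_KEYS - {"id"}}
        for row in rows
        if REQUIRED_KEYS <= row.keys()
    }


codings = index_codings(RAW_RESPONSE)
print(codings["ytc_Ugz_ybgufAgnJUsJET94AaABAg"]["reasoning"])  # virtue
```

Indexing by id (rather than by list position) makes the lookup robust if the model reorders or drops comments in a batch.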