Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
i can accept full ai in a few cases. 1) if the ai was flawless, and remote upgradeable or killable. put an ai where lets say "anyone with a weapon will be killed on sight" than anyone with a weapon or a weapon looking device will be killed regardless of age sex or intent. no flaw that would auto kill individuals without weapons and so forth. other than that, i believe a human operator should always be behind a drone, no matter how far ai goes.
YouTube 2012-11-24T00:3…
Coding Result
Dimension       Value
--------------  --------------------------
Responsibility  ai_itself
Reasoning       deontological
Policy          liability
Emotion         approval
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgwiBO59xpLPCkecBqd4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy_uEjQogI-wb-bmPB4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugx7nuMjJl6i0N6vTmZ4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_Ugys3yBMDKkxZIDT7cd4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgxNKwh_r49bvXQyVH94AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxH2eMZsO_x_AjYfy94AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxsRrYrWDDcgzPrJQN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgwcPYYDaK_--Y12hiN4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgxXAQENEamBTwM_Aht4AaABAg", "responsibility": "none", "reasoning": "deontological", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugykgw-hR28QCN8z1ep4AaABAg", "responsibility": "ai_itself", "reasoning": "deontological", "policy": "liability", "emotion": "approval"}
]
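The raw response is a JSON array, so the per-comment coding can be recovered by keying the array on the comment id. The sketch below is a minimal, assumed illustration of that lookup (not the tool's actual code); the two ids and label values are taken verbatim from the response above.

```python
import json

# Assumed illustration: a raw LLM batch response, trimmed to two entries
# copied from the response shown above.
raw_response = '''[
  {"id": "ytc_Ugys3yBMDKkxZIDT7cd4AaABAg", "responsibility": "ai_itself",
   "reasoning": "deontological", "policy": "liability", "emotion": "approval"},
  {"id": "ytc_UgwiBO59xpLPCkecBqd4AaABAg", "responsibility": "none",
   "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"}
]'''

# Index the batch by comment id, then pull the coding for one comment.
codings = {row["id"]: row for row in json.loads(raw_response)}
coding = codings["ytc_Ugys3yBMDKkxZIDT7cd4AaABAg"]
print(coding["responsibility"], coding["emotion"])  # ai_itself approval
```

This is the same mapping shown in the coding-result table: the dimensions (responsibility, reasoning, policy, emotion) come from the entry whose id matches the inspected comment.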