Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
"I'd rather see machines on the field than people." This interviewer doesn't even understand the issue: Things which exterminate humans are no threat to the machines. Nuclear fallout? No problem. Bio-weapons? No problem. The AI soldiers will err on the side of exterminating humans by accident as a side-effect of the weapons used.
youtube 2026-03-09T18:3…
Coding Result
Dimension       Value
Responsibility  ai_itself
Reasoning       consequentialist
Policy          ban
Emotion         outrage
Coded at        2026-04-27T06:24:53.388235
Raw LLM Response
[
  {"id": "ytc_UgzpRQl_IL1_YbQwge54AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgwQ65KoF5dL3pTBjg54AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgyBDqtwtIoqv0yfviR4AaABAg", "responsibility": "developer", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugz9aBtSkJ2Ir_Lhbsp4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyIgPCXiWfupeUEsuF4AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_UgxxUe1k-KPvMIvnTU54AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "ban", "emotion": "outrage"},
  {"id": "ytc_UgwWzHzdbNpVHa0PRh14AaABAg", "responsibility": "company", "reasoning": "deontological", "policy": "liability", "emotion": "fear"},
  {"id": "ytc_UgyEdWhSa8zSkKD3l0F4AaABAg", "responsibility": "company", "reasoning": "virtue", "policy": "none", "emotion": "mixed"},
  {"id": "ytc_UgyhcYxggIH0qjW2MKN4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgxMtGzdLdJd47aWb3J4AaABAg", "responsibility": "company", "reasoning": "contractualist", "policy": "regulate", "emotion": "approval"}
]
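A batch response like the one above can be parsed and sanity-checked before the codes are stored. The sketch below is a minimal example, not the tool's actual pipeline; the allowed values per dimension are inferred only from the codes visible in this sample, so the real codebook may contain additional categories.

```python
import json

# Allowed values per coding dimension, inferred from the sample output
# above (assumption: the real codebook may define more categories).
ALLOWED = {
    "responsibility": {"government", "ai_itself", "developer", "company", "user", "none"},
    "reasoning": {"deontological", "consequentialist", "virtue", "contractualist", "mixed"},
    "policy": {"ban", "liability", "regulate", "none"},
    "emotion": {"fear", "outrage", "indifference", "approval", "mixed"},
}

def parse_llm_response(raw: str) -> list[dict]:
    """Parse a raw LLM batch response and validate each coded comment."""
    records = json.loads(raw)
    for rec in records:
        if "id" not in rec:
            raise ValueError(f"record missing 'id': {rec}")
        for dim, allowed in ALLOWED.items():
            value = rec.get(dim)
            if value not in allowed:
                raise ValueError(f"{rec['id']}: unexpected {dim} value {value!r}")
    return records

raw = ('[{"id":"ytc_UgwQ65KoF5dL3pTBjg54AaABAg","responsibility":"ai_itself",'
       '"reasoning":"consequentialist","policy":"ban","emotion":"outrage"}]')
records = parse_llm_response(raw)
print(records[0]["emotion"])  # outrage
```

Rejecting any value outside the codebook surfaces malformed model output early, rather than letting an invented category silently enter the coded dataset.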