Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I have thought of an argument I don't see brought up often, something I call the human-expendability morality factor. AI combatants largely remove the risk to human lives on the battlefield, and that is a considerable boon, but it also turns war into a contest that can only be described as financial, especially when one side lacks the financial backing to field drones and must still rely on human soldiers. The wealthier nation can then wage war without fear of public backlash over soldiers being killed, as long as its money continues to flow into combat-unit production and deployment, while the other side suffers countless waves of death and destruction. For example, one of the many things that raised a red flag in US public opinion during the Iraq War was the growing number of US servicemen casualties, which prompted waning support. Remove that human factor, however, and as long as the economy is steady, the populace wouldn't give a second thought to the combat being waged on foreign soil. Some would say, "let them have it!" in the case of extremists, but should this callous behavior be directed at the civilized world, a problem will arise. Should both sides be economically stable, the casualties will continue to climb until an economy can no longer support its combat-drone infrastructure, but by then, what has been lost?
youtube 2018-07-08T06:3…
Coding Result
Dimension       Value
Responsibility  distributed
Reasoning       mixed
Policy          unclear
Emotion         mixed
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgxuXsUwJiJSoEOk1FR4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzxo3cOJyT6r8Jwd1x4AaABAg", "responsibility": "distributed", "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"},
  {"id": "ytc_UgwrFyFd5o_BxlyXdtR4AaABAg", "responsibility": "none", "reasoning": "unclear", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx0tAI2eeRfeV-RKAl4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugx936JxhOTtUou0OyF4AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_UgxCP7px-NKjfebehsl4AaABAg", "responsibility": "ai_itself", "reasoning": "virtue", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugza8Oxhy0Bx4TEqW_d4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_Ugzchn6q-aLEWol8Mr14AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "fear"},
  {"id": "ytc_UgyiCA71F2FS2oS2FUB4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "regulate", "emotion": "indifference"},
  {"id": "ytc_Ugwz0UNMHEZkRRbVnBZ4AaABAg", "responsibility": "government", "reasoning": "deontological", "policy": "none", "emotion": "resignation"}
]
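The raw response is a JSON array of per-comment codings keyed by comment id. A minimal Python sketch of how such a response can be parsed and one comment's coding looked up (the two-entry array below is an illustrative excerpt reusing entries from the response above; the variable names are hypothetical, not part of any pipeline shown here):

```python
import json

# Excerpt of a raw LLM response: a JSON array of coding objects,
# one per comment, each carrying the four coded dimensions.
raw = '''[
  {"id": "ytc_UgxuXsUwJiJSoEOk1FR4AaABAg", "responsibility": "government",
   "reasoning": "consequentialist", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugzxo3cOJyT6r8Jwd1x4AaABAg", "responsibility": "distributed",
   "reasoning": "mixed", "policy": "unclear", "emotion": "mixed"}
]'''

# Index the codings by comment id so a single comment can be inspected.
codings = {entry["id"]: entry for entry in json.loads(raw)}

# Look up the coding for the comment shown on this page.
coding = codings["ytc_Ugzxo3cOJyT6r8Jwd1x4AaABAg"]
print(coding["responsibility"])  # distributed
print(coding["emotion"])         # mixed
```

Indexing by id is what lets an inspection page like this one pair each displayed comment with the exact model output that produced its coding.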