Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
I am sorry, but on a unit based level in a military context it is *UNAVOIDABLE*. *Why?* Remote controlled drones can be electronically jammed by other i-nations. For example a interceptor drone meant to shoot down other drones cant be remote operated. Because the first defense of another drone would then be to jam signals etc in order to make the interceptor drone incapable of getting input from the remote user. Also, lets not confuse these robots with AI ala the Terminator. Their independence will be insect based at best, which is why you will first see it in the air where there are allot less obstacles or similar to get "stuck on" and the "if - else" type coding will be sufficient. If you want to play an analogy of the first types of independent combat drones out there you can actually do so,... kinda, by checking out the age old PS1 game Carnage Heart. That is how "smart" those drones will be and how they principally will operate. However, having a "Skynet" like machine in control of nukes or worse is fully avoidable. Its ironic, but the biggest worry about automation with robots is the jobs question. Current day economic and Keynesian based Capitalism wont work if most people do not have a chance at getting a job. And without a critical mass of people earning a living it will also affect demand which, and here comes the ironic part, might even mean the robots lose their jobs.
youtube 2015-07-30T19:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       deontological
Policy          none
Emotion         resignation
Coded at        2026-04-26T23:09:12.988011
Raw LLM Response
[
  {"id":"ytc_UggLZ6M2z5JqTngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_Ugi3e0GA4HfH8HgCoAEC","responsibility":"none","reasoning":"virtue","policy":"none","emotion":"approval"},
  {"id":"ytc_Ugizge_QLY4xw3gCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"none","emotion":"fear"},
  {"id":"ytc_UghHaxdOpGagangCoAEC","responsibility":"user","reasoning":"deontological","policy":"none","emotion":"indifference"},
  {"id":"ytc_UgieDwB_j4qUKngCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"ban","emotion":"fear"},
  {"id":"ytc_UgjvbmAc83_c83gCoAEC","responsibility":"developer","reasoning":"consequentialist","policy":"industry_self","emotion":"indifference"},
  {"id":"ytc_Ugia_nrfbV5-d3gCoAEC","responsibility":"none","reasoning":"consequentialist","policy":"none","emotion":"approval"},
  {"id":"ytc_UggRCZjvTN6Mg3gCoAEC","responsibility":"none","reasoning":"deontological","policy":"none","emotion":"resignation"},
  {"id":"ytc_UgjRoWWlA3PONXgCoAEC","responsibility":"ai_itself","reasoning":"consequentialist","policy":"liability","emotion":"fear"},
  {"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed","reasoning":"mixed","policy":"none","emotion":"mixed"}
]
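The coded values shown for this comment come from one record in the raw array (the record whose dimensions match is "ytc_UggRCZjvTN6Mg3gCoAEC"). A minimal Python sketch of how such a response could be parsed and a record looked up by id; the function name and the inline sample are illustrative, not the tool's actual code:

```python
import json

# Abbreviated sample of a raw LLM response in the same shape as above.
raw_response = (
    '[{"id":"ytc_UggRCZjvTN6Mg3gCoAEC","responsibility":"none",'
    '"reasoning":"deontological","policy":"none","emotion":"resignation"},'
    '{"id":"ytc_Ugh0VoxfRhV-tXgCoAEC","responsibility":"distributed",'
    '"reasoning":"mixed","policy":"none","emotion":"mixed"}]'
)

def find_record(raw, comment_id):
    """Parse the raw JSON array and return the record for comment_id, or None."""
    records = json.loads(raw)
    return next((r for r in records if r.get("id") == comment_id), None)

rec = find_record(raw_response, "ytc_UggRCZjvTN6Mg3gCoAEC")
print(rec["emotion"])  # resignation
```

Looking records up by id rather than by position guards against the model returning the batch in a different order than it was sent.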