Raw LLM Responses

Inspect the exact model output for any coded comment.

Comment
It is completely ridiculous to think of current-level AI as remotely threatening, though. The AI actually out there is frankly still stupid, unable to really think for itself. Even supervised machine learning is largely based on randomness rather than clever decision-making and prediction. And we haven't even talked about how current AI cannot rewrite itself, code its own code to expand itself, etc. The suggestion that private AI-focused companies are ahead of the military is also mostly just plain false.

Also, the 'Ripsaw' vehicle is not a real part of current-day warfare. It's a thing that was never widely adopted, despite a prototype being used in Iraq to see how effective it could be. Pretty sure it was never autonomous back then, though. Last I heard, interest in it faded when the Black Knight vehicle by BAE Systems was developed. And even that project seems to be at a dead end, as it is seemingly impossible to make the wireless communications work safely, without the potential for them being hacked and control being taken over. Remember the hacked spy drones? Yeah, that type of stuff.

Honestly, it would very much surprise me if any of the bigger militaries would really allow more autonomous vehicles to participate in warfare. It is one thing to replace a pilot or boat crew, but a whole other thing to allow an autonomous vehicle to fire at its own 'will'. There's a reason why drone pilots are still humans, sitting in a container on the other side of the world deciding when to fire, etc.

As for whether a computer would be a more capable pilot than a human: of course it would! It has a direct feed of sensors; it has basically instant knowledge of what is about to happen. No surprise there.
youtube 2018-04-07T16:2…
Coding Result
Dimension       Value
Responsibility  none
Reasoning       consequentialist
Policy          none
Emotion         indifference
Coded at        2026-04-27T06:24:59.937377
Raw LLM Response
[
  {"id": "ytc_UgymU-_jZ6AYNzHLXyl4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_UgwoJMQsJK4l7_qTYY14AaABAg", "responsibility": "none", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugy1opUJZPpTaCzixfV4AaABAg", "responsibility": "user", "reasoning": "consequentialist", "policy": "none", "emotion": "approval"},
  {"id": "ytc_Ugwddz6KSwtidh3OXJ94AaABAg", "responsibility": "company", "reasoning": "consequentialist", "policy": "regulate", "emotion": "fear"},
  {"id": "ytc_UgyBgsq51EU3Ab9SDiZ4AaABAg", "responsibility": "distributed", "reasoning": "deontological", "policy": "ban", "emotion": "fear"},
  {"id": "ytc_UgwQKXEkh_sVhciDqrJ4AaABAg", "responsibility": "ai_itself", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_Ugw7iGQSIGqblk8SSMh4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "outrage"},
  {"id": "ytc_Ugybg98pKjriV4QjR2h4AaABAg", "responsibility": "none", "reasoning": "mixed", "policy": "none", "emotion": "resignation"},
  {"id": "ytc_Ugx5hRf__vHqBl_Hxjt4AaABAg", "responsibility": "government", "reasoning": "consequentialist", "policy": "none", "emotion": "indifference"},
  {"id": "ytc_UgzuorSzfY1wjp2W3dx4AaABAg", "responsibility": "distributed", "reasoning": "consequentialist", "policy": "liability", "emotion": "fear"}
]
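The coding table shown above corresponds to a single entry in this raw array, matched by comment id. A minimal sketch of that lookup, assuming the raw LLM response parses as valid JSON exactly as displayed (the `codes_for` helper is illustrative, not part of the tool):

```python
import json

# A trimmed copy of the raw LLM response: a JSON array of per-comment codes.
# Only the entry for the comment inspected above is kept here for brevity.
raw_response = """[
  {"id": "ytc_UgwoJMQsJK4l7_qTYY14AaABAg",
   "responsibility": "none", "reasoning": "consequentialist",
   "policy": "none", "emotion": "indifference"}
]"""

def codes_for(comment_id, raw):
    """Return the coding dict for one comment id, or None if it is absent."""
    rows = json.loads(raw)
    return next((row for row in rows if row["id"] == comment_id), None)

row = codes_for("ytc_UgwoJMQsJK4l7_qTYY14AaABAg", raw_response)
print(row["emotion"])  # indifference
```

A lookup like this is how the Dimension/Value table for a comment can be reconstructed from the stored raw response; an id with no matching entry simply yields None rather than raising.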