Pentagon Official’s Report Brings to Light Real-World Dangers of Autonomous Weapons

Brittany Charles

The Terminator will be real during our lifetime. Perhaps not the Terminator itself, but thanks to low-cost sensors and artificial intelligence, the concept of autonomous weapons that act without human intervention is becoming a reality[1]. Weaponry capable of targeting and killing entirely free of human intervention, while not available in the US, is appearing in military arsenals throughout the world[2].

According to the N.Y. Times, such operations are controversial because, although a human operator initially selects a target, some of these systems are designed to operate beyond the operator’s control, at times over hundreds of miles, before identifying and attacking a target[3]. The technology can be incorporated into a variety of weapon systems, including robots, missiles, stationary weapon platforms and drones[4].

So why is a Pentagon official reporting that low-cost weaponry capable of completing military actions without human intervention is dangerous? These weapon systems are completely autonomous. According to Paul Scharre, one of the authors of the 2012 Defense Department directive, “Having a person in the loop is not enough…the human has to be actively engaged.”[5] Furthermore, these systems are capable of being hacked, spoofed or manipulated by adversaries. Kind of like the Terminator. Perhaps a weapon that can decide how to complete military objectives, and that can also be influenced by others, is a weapon we might want a little more control over?

[1] John Markoff, Report Cites Dangers of Autonomous Weapons, N.Y. Times (Feb. 28, 2016), http://www.nytimes.com/2016/02/29/technology/report-cites-dangers-of-autonomous-weapons.html?_r=0.

[2] Id.

[3] Id.

[4] Id.

[5] Id.