Let's start with the weaponry porn: Here is a Boeing-produced video of an unmanned F-16 fighter jet evading a ground-to-air missile, and it's every bit as awesome as you would expect.
While the point of the exercise was to see how the plane functioned in a surveillance role, the key line is: "It performs unmanned just like it does manned." And that's something to think carefully about as we move beyond the current generation of pilotless aircraft. The remotely controlled Predator drone is an astonishing piece of military technology, but compared with the next-generation F-35 fighter, it's just a toy airplane. And let's face it, in a real combat scenario, there will be no way to remotely pilot an unmanned fighter jet -- it's going to have to be able to make kill-or-be-killed decisions on its own.
The slippery slope is pretty obvious: From the Spartoi protecting the Golden Fleece to the Droid Army of the (three worst) "Star Wars" movies, men have dreamed of war without people -- and of avoiding all its expense and suffering, for the side that has the robots, anyway. Advances in human-controlled remote warfare -- "dumb" devices such as the Predator and the Talon bomb-disposal robot made famous in "The Hurt Locker" -- are themselves steps toward the smart machines of the future.
In fact, the future is upon us. Earlier this year, the Russian military announced that sentry duties at five ballistic missile sites would be taken over by a "mobile robotic complex" (video highly recommended) with the ability to "detect and destroy targets without human involvement." Last year, a U.S. Navy drone landed itself on an aircraft carrier -- a task deemed too tricky for a human with a joystick. South Korea has experimented with Samsung-made robotic gun towers along the demilitarized zone that can detect and then fire on human intruders. (It politely gives the targets an audible warning before launching grenades in their direction.)
The Pentagon being the (sequestered) bureaucracy that it is, the lure of robot armies is as much about the bottom line as the casualty list. In January, General Robert Cone, then the head of the Army's Training and Doctrine Command, said that because "people are our major cost" the service was considering shrinking the typical brigade to 3,000 troops from 4,000, with the slack to be picked up by "robots or manned/unmanned teaming." But these are support roles, not combat ones.
Likewise, the Air Force tends to tamp down talk about drones pressing their own "fire" buttons, and seems to think the most urgent use for artificial intelligence technology is packing up all the junk the Army makes it lug around -- behold the robopallet. In 2012 the Defense Department adopted a go-slow policy directive on all autonomous and semi-autonomous weapon systems that former New York Times editor Bill Keller saw as a somewhat leaky "10-year moratorium ... while it discusses the ethical implications and possible safeguards."
That's what the military dishes out for public consumption, anyway. Meanwhile, plenty of its best minds are envisioning the age of tactically autonomous systems. One, Captain Michael W. Byrnes, explained in the Air Force's Air & Space Power Journal that "A tactically autonomous aircraft ... need not seek science-fiction-like self-awareness; within the scope of air-to-air combat, it is an airborne computer that executes the underlying mathematical truths of what human combat pilots do in the cockpit, doing so more quickly and with more precision." (Also, Byrnes insists we need not fear the Cylon apocalypse, which is comforting.) Nor is the military ignoring the ethical ramifications: Last year the Navy announced $7.5 million in grant money for academics to "explore how to build a sense of right and wrong and moral consequence into autonomous robotic systems."
Some people think this is all for the best: "If a drone's system is sophisticated enough, it could be less emotional, more selective and able to provide force in a way that achieves a tactical objective with the least harm," Samuel Liles, a professor in the Cyber Forensics Laboratory at Purdue University, told Joshua Foust in National Journal. "A lethal autonomous robot can aim better, target better, select better, and in general be a better asset."
If you find that terrifying, you are not alone.
To contact the writer of this article: Tobin Harshaw at firstname.lastname@example.org.
To contact the editor of this article: Zara Kessler at email@example.com.