As a little kid I worried a lot, about everything from marauding robot armies laying waste to Americans with death rays (which I saw in a terrifying old sci-fi movie called Target Earth) to the prospect of starving hordes of old people ravaging the countryside after Social Security broke down (which I heard in an even more terrifying Barry Goldwater campaign speech). My father’s inevitable response to my regular heebie-jeebie attacks was, “Don’t worry, it’ll never happen.”
Good as my dad was at most things, his talent for prophecy was limited. Actuarial tables long ago blew the whistle on Social Security, a Ponzi scheme that starts to totter when population increases flatten out. And the killer robots aren’t just in our future, they’re already here, as I learned at a weekend conference on the ethical and legal implications of advancing robot technology.
My father was right about one thing: The military robots aren’t carrying the colors of the Venus Interplanetary Expedition forces, but those of the U.S. Army. The Pentagon already has, by its own count, $20 billion worth of robots in uniform, doing everything from reconnaissance missions to clearing land mines and booby traps.
Those assignments may sound relatively benign as martial arts go. But cruise missiles, which locate and navigate to targets on their own after being launched by humans, are a species of robot. So are the missile-firing drone aircraft that roam the skies of Afghanistan and Pakistan, blowing up suspected terrorists. They’ll soon have company on the ground. Great Britain’s QinetiQ Group is marketing a robot tank that packs a 7.62 mm machine gun and a four-barreled grenade launcher. Another machine-gun-equipped robot tank made by Samsung is already patrolling South Korea’s northern border.
The Samsung tank doesn’t open fire unless a human operator back at headquarters tells it to — but it could. It’s equipped with heat and motion sensors that enable it to identify human targets and shoot them. The same is true, or soon will be, for most of the other weapons. The University of Ottawa law school’s Ian Kerr and Katie Szilagy, in a paper delivered at the conference, said that more than 40 countries are developing so-called autonomous weapon systems in which machines rather than humans will deliver “targeting instructions and even decisions about whether and when to pull the trigger.”
And what general wouldn’t want them? Robots are, in many ways, better soldiers than humans. They don’t get tired or hungry, and they never have to go to the bathroom — drone aircraft can keep their eyes on a target for as much as 30 hours without refueling. They don’t panic when the robot next to them gets shot; their sensors cut right through the fog of war. Because they don’t have to be fed, clothed, hospitalized or paid pensions, they’re cheaper. Best of all, as Rutgers law professor Richard M. O’Meara (a retired U.S. Army brigadier general) noted, when a robot gets killed, nobody has to write a letter to the parents.
Much was made at the symposium of the figurative inhumanity of handing war over to machines. But it seems to me war, in some ways, might be more humane if fought by robots. Untainted by human emotions, robot soldiers wouldn’t rape thousands of German women the way Russian soldiers did as they advanced on Berlin at the end of World War II, or deliberately massacre scores of civilians in revenge for heavy casualties as U.S. troops infamously did at My Lai during the Vietnam War. Panelists at the symposium wondered whether robots would really be able to tell a farmer’s hoe or hunting rifle from a soldier’s AK-47 before firing at somebody in a combat zone — ignoring the fact that human troops have trouble doing so right now.
The real danger of robot armies is not that they will replace human judgment, but that they’ll replace human blood. “It changes the politics of war in potentially terrible ways,” O’Meara said. “It’s easy to go to war when there are no body bags coming home.”
Politicians will likely find shooting a seductively easy alternative to talking when the only casualties (at least on their side) will be diodes and microchips. In fact, they already have. Since President Obama took office, the United States has launched more than 250 aerial drone attacks in Pakistan alone. The cost in Pakistani lives reaches at least into the hundreds, probably the thousands; their judge, jury and executioner was some CIA kid with a joystick.
Obama has been able to keep this process largely secret because there have been no American casualties to generate controversy or open prying eyes. But a Washington Post poll earlier this year showed a whopping 83 percent approval rate for his drone war, including 77 percent of self-identified liberal Democrats. The fault, dear Brutus, lies not in our robots but in ourselves.