Unmanned drone attacks and shape-shifting robots: War's remote-control future - CSMonitor.com
By Anna Mulrine, Staff writer
posted October 22, 2011 at 1:30 pm EDT
Washington; and Kandahar, Afghanistan
In the shadow of a heavily fortified enemy building, US commanders call in a chemical robot, or what looks like a blob. They give it a simple instruction: Penetrate a crack in the building and find out what's inside. Like an ice sculpture or the liquid metal assassin in "Terminator 2," the device changes shape, slips through the opening, then reassumes its original form to look around. It uses sensors woven into its fabric to sample the area for biological agents. If needed, it can seep into the cracks of a bomb to defuse it.
Soldiers hoping to eavesdrop on an enemy release a series of tiny, unmanned aircraft the size and shape of houseflies to hover in a room unnoticed, relaying invaluable video footage.
A fleet of drones roams a mountain pass, spraying a fine mist along a known terrorist transit route – the US military's version of "CSI: Al Qaeda." Days later, when troops capture suspects hundreds of miles away, they test them for traces of the "taggant" to discover whether they have traversed the trail and may, in fact, be prosecuted as insurgents.
Welcome to the battlefield of the future. Malleable robots. Insect-size air forces. Chemical tracers spritzed from the sky. It's the stuff of science fiction.
But these are among the myriad futuristic war-fighting creations currently being developed at universities across the country with funds from the US military. And the future, in many cases, may not be too far off.
Engineering students at the Naval Postgraduate School in Monterey, Calif., for instance, are now experimenting with chemical taggants on unmanned aerial vehicles (UAVs) like the ones being used in Afghanistan. Sure, the shape-changing chemical robot that slips through cracks may be more Ray Bradbury than battlefield-ready. But the Pentagon, in its perpetual quest to find the next weapon or soldier-saving device – and with scientific assurances that it's possible – is already investing millions to develop it.
"We're not about 20 years, or 10 years, or even five years away – a lot of this could be out in the field in under two years," says Mitchell Zatkin, former director of programmable matter at the Defense Advanced Research Projects Agency, or DARPA, the Pentagon's premier research office.
The development of a new generation of military robots, including armed drones, may eventually mark one of the biggest revolutions in warfare in generations. Throughout history, from the crossbow to the cannon to the aircraft carrier, one weapon has supplanted another as nations have strived to create increasingly lethal means of allowing armies to project power from afar.
But many of the new emerging technologies promise not only firepower but also the ability to do something else: reduce the number of soldiers needed in war. While few are suggesting armies made up exclusively of automated machines (yet), the increased use of drones in Afghanistan and Pakistan has already reinforced the view among many policymakers and Pentagon planners that the United States can carry out effective military operations by relying largely on UAVs, targeted cruise missile strikes, and a relatively small number of special operations forces.
At the least, many enthusiasts see the new high-tech tools helping to save American lives. At the most, they see them changing the nature of war – how it's fought and how much it might cost – as well as helping America maintain its military preeminence.
Yet the prospect of a military less reliant on soldiers and more on "push button" technologies also raises profound ethical and moral questions. Will drones controlled by pilots thousands of miles away, as many of them are now, reduce war to an antiseptic video game? Will the US be more likely to wage war if doing so does not risk American lives? And what of the oversight role of Congress in a world of more remote-control weapons? Already, when lawmakers on Capitol Hill accused the Obama administration of circumventing their authority in waging war in Libya, White House lawyers argued in essence that an operation can't be considered war if there are no troops on the ground – and, as a result, does not require the permission of Congress.
"If the military continues to reduce the human cost of waging war," says Lt. Col. Edward Barrett, an ethicist at the US Naval Academy in Annapolis, Md., "there's a possibility that you're not going to try hard enough to avoid it."
Beneath a new moon, a crew pushes the 2,500-pound Predator drone toward a blacked-out flight line and prepares it for takeoff. The soldiers wheel over a pallet of Hellfire missiles and load them onto the plane's undercarriage. The Predator pilot walks around the aircraft, conducting his preflight check. He then returns to a nearby trailer, sits down at a console with joysticks and monitors, and guides the snub-nosed plane down the runway and into the night air – unmanned and fully armed.
The metronome-regular takeoffs of Predators here at Kandahar Air Field, in southern Afghanistan, have helped turn this strip of asphalt into what the Pentagon calls the single busiest runway in the world. An aircraft lifts off or lands every two minutes. It's a reminder of how integral drones have become to the war in Afghanistan and the broader war on terror.
Initially, of course, the plan was not to put weapons on Predator drones at all. Like the first military airplanes, they were to be used just for surveillance. As the war in Iraq progressed, however, US service members jury-rigged the drones with weapons. Today, armed Predators and their larger offspring, Reapers, fly over America's battlefields, equipped with both missiles and powerful cameras, and have become the most widely used and, arguably, most important tools in the US arsenal.
Since drones were first introduced in Iraq and Afghanistan, their numbers have grown from 167 in 2002 to more than 7,000 today. The US Air Force is now recruiting more UAV pilots than traditional ones.
"The demand has just absolutely skyrocketed," says the commander of the Air Force's 451st Operations Group, which runs Predator and Reaper operations in Kandahar.
As their numbers have grown, so has the sophistication with which the military uses them. The earliest drones operated more as independent assets – as aerial eyes that sent back intelligence and dropped their bombs. But today the unmanned aircraft are integrated into almost every operation on the ground, acting as advanced scouts and omniscient surveyors of battle zones. They monitor the precise movements of insurgents and kill enemy leaders. They conduct "virtual lineups," zooming in powerful cameras to help determine whether a suspected insurgent may have carried out a particular attack.
"A lot of the ground commanders won't execute a mission without us," says the Air Force's commander of the 62nd Expeditionary Reconnaissance Squadron in Afghanistan.
Robots, too, have become a far more pervasive presence on America's fields of battle. Remote-control machines that move about on wheels and tracks scour for roadside bombs in Iraq and Afghanistan. Soldiers in the mountains of eastern Afghanistan carry hand-held drones in backpacks, which they assemble and throw into the air to scope out terrain and check for enemy fighters. In the past 10 years, the Pentagon's use of robots has grown from zero to some 12,000 in war zones today.
Part of the exponential rise in the use of UAVs and robots stems from a confluence of events: improvements in technology and America's prolonged involvement in two simultaneous wars.
There is, too, the prospect of more money for military contractors eyeing a downturn in future defense budgets. Today, the amount of money being spent on research for military robotics surpasses the budget of the National Science Foundation, which, at $6.9 billion a year, funds nearly one-quarter of all federally supported scientific research at the nation's universities.
Military officials also see in the new technologies the possibility of savings in an era of shrinking budgets. Deploying forces overseas can now cost as much as $1 million a year per soldier.
Yet the biggest allure of the new high-tech armaments may be something as old as conflict itself: the desire to reduce the number of casualties on the battlefield and gain a strategic advantage over the enemy. As Lt. Gen. Richard Lynch, a commander in Iraq, observed at a conference on military robotics in Washington earlier this year: "When I look at the 153 soldiers who paid the ultimate sacrifice [under my command], I know that 80 percent of them were put in a situation where we could have placed an unmanned system in the same job."
Drones, in particular, seem the epitome of risk-free warfare for the nation using them – there are, after all, no pilots to shoot down. Moreover, the people who run them are often nowhere near the field of battle. Some 90 percent of the UAV operations over Afghanistan are flown by people in trailers in the deserts of Nevada. In Kandahar, soldiers help the planes take off and land and then hand over controls to the airmen in the US.
"We want to minimize the [human] footprint as much as possible," says the 451st Operations Group commander at the Kandahar airfield, where the effects of being close to the war are clearly visible: The plywood walls of the tactical operations center are lined with framed bits of jagged metal from mortars that have fallen on the airfield over the years.
While the distant control of drones may well protect American lives, it raises questions about what it means to have people so far removed from the field of conflict. "Sometimes you felt like God hurling thunderbolts from afar," says Lt. Col. Matt Martin, who was among the first generation of US soldiers to work with drones to wage war and who has written a book – "Predator: The Remote-Control Air War Over Iraq and Afghanistan: A Pilot's Story."
Martin agrees that the unmanned aircraft no doubt reduce American casualties, but wonders if they make killing "too easy, too tempting, too much like simulated combat, like the computer game Civilization."
It probably doesn't reassure critics that the flight controls for drones have, over the years, come to resemble video-game controllers – a deliberate choice by the military to make them more intuitive for a generation of young soldiers raised on games like Gears of War and Killzone.
Martin knows what it's like to confront the dark side of war, even as he fought it from afar. During one operation, he was piloting a drone that was tracking an insurgent. Just after he fired one of the aircraft's missiles, two children rode their bicycles into range. They were both killed. "You get good at compartmentalizing," says Martin.
What worries critics are those who become too good at it – and, more generally, the impact of waging war at a distance. Some fret about the mechanics of the decisionmaking process: Who ultimately makes the decision to pull the trigger? And how do you decide whom to put on the hit list – a top Al Qaeda official, yes, but is some petty but persistent insurgent really a matter of national security?
As the US increasingly uses drones in its secret campaigns, questions arise about how much to tell America's allies about UAV attacks and whether the strikes alienate local populations more than they help subdue the enemy – a dilemma the US confronts starkly, and almost weekly, in its drone campaign in Pakistan.
From the US military's viewpoint, the drone war has been fantastically successful, helping to kill key Al Qaeda operatives and Taliban insurgents with a minimum of civilian casualties and almost no US troops put at risk.
Some even believe that the ethical oversight of drones is far more rigorous than that of manned aircraft, since at least 150 people – ground crews, engineers, pilots, intelligence analysts – are typically involved in each UAV mission.
What counts as a minimum of civilian losses is, of course, subjective. In 2009, the Brookings Institution, a Washington think tank, estimated that the US drone war was killing about 10 civilians for every insurgent in Pakistan. That may be far fewer casualties than traditional airstrikes would cause. But it is hardly comforting to the Pakistanis.
Moreover, the very practice of taking out enemy leaders or sympathizers could at some point, according to detractors, devolve into an aerial assassination campaign. When the US used a drone strike last month to kill jihadist cleric and American-born Anwar al-Awlaki in Yemen, President Obama hailed it as a "major blow" to Al Qaeda in the Arabian Peninsula. But some critics decried the killing of a US citizen with no public scrutiny.
Barrett, who is the director of research at the Naval Academy's Stockdale Center for Ethical Leadership, discusses with his students the prospect of whether UAVs make it easier to wage war if the government doesn't have to worry about a public outcry. "There are not the mass numbers of troops moving around and visible, so it could be easier to circumvent the oversight of Congress and, therefore, legitimate authority," he notes.
Others ask a simpler, more practical question: What about the troops who conduct the UAV strikes from the Nevada desert – could they become legitimate targets of America's enemies at, say, a local mall, bringing the war on terror to the suburbs?
Some worry that the US is, in fact, placing too heavy a burden on its UAV troops. Despite warnings that "video-game warfare" might make them callous to killing, new studies suggest that the stress levels drone operators face are higher than those for infantry forces on the ground.
"Having this idea of a 'surgical war' where you can really just pinpoint the bad guys with the least amount of damage to our own force, there's a bit of naiveté in all that," says Maryann Cusinamo Love, an associate professor at Catholic University of America in Washington, D.C.
She says the powerful cameras on the drones allow pilots to see in "great vivid detail the real-time results of their actions. That is an incredible stress on them."
It is also, she argues, a "ghettoization of the killing function in war." However justified the military mission may be, she says, "You are still giving the most stressful job of war disproportionately to this one subset of people."
For nearly as long as militaries have existed, they have invented arms to keep their soldiers as far from danger as possible. Some sound ridiculous, others terrifying, but most have raised questions of fairness in warfare.
During World War II, Japanese forces used the jet stream to launch paper "fire balloons" rigged with bombs meant to explode when they drifted over US soil. One such balloon discovered by an American family during a picnic in the Oregon woods resulted in the only deaths in the continental US caused by enemy hostilities in the war.
For their part, US scientists experimented with a form of bio-inspired warfare: a "bat bomb" that they planned to launch in parachute-rigged casings over Japan. They imagined fitting the bodies of tiny bats with incendiary bombs on timers. The theory was that the bats, once dropped, would roost in the eaves and attics of Japan's delicate wooden dwellings, setting off fires. The technology was successfully tested but scrapped when it was deemed too expensive by the Pentagon.
On the Western front, Germany was experimenting with a remote-control tank known as the Goliath. It used technology pioneered by an American who had demonstrated a remote-control boat years earlier at Madison Square Garden in New York City. When he tried to sell his technology to the US military, however, he was met with ridicule.
"He said, 'I've got this technology,' but they started laughing – they thought he was crazy," says Peter Singer, author of "Wired for War: The Robotics Revolution and Conflict in the 21st Century."
With the advent of the US wars in Iraq and Afghanistan, however, technology has once again rendezvoused with military necessity. A company called iRobot in Bedford, Mass., sent a prototype of its PackBot, which soldiers began using to clear caves and bunkers suspected of being mined. When the testing period was over, "The Army unit didn't want to give the test robot back," Mr. Singer notes.
While the use of robots that can detect and defuse explosives is growing exponentially, the next big frontier for America's military R2-D2s may parallel what happened to drones: They may be fitted with weapons – offering new fighting capabilities as well as raising new concerns.
Already, researchers are experimenting with attaching machine guns to robots that can be triggered remotely. Field tests in Iraq for one of the first weaponized robots, dubbed SWORDS, didn't go well.
"There were several instan*ces of noncommanded firing of the system during testing," says Jef*frey Jacz*kow*ski, deputy manager of the US Army's Robotic Systems Joint Project Office.
Though US military officials tend to emphasize that troops must remain "in the loop" as robots or drones are weaponized, there remains a strong push for automation coming from the Pentagon. In 2007, the US Army sent out a request for proposals calling for robots with "fully autonomous engagement without human intervention." In other words, the ability to shoot on their own.
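To make that distinction concrete, here is a minimal illustrative sketch – every name, type, and rule in it is invented for explanation and drawn from no real weapons system – of the difference between keeping a human "in the loop" and the fully autonomous engagement the 2007 request for proposals described:

# Hypothetical sketch only: contrasts "human in the loop" with fully
# autonomous engagement. All names and logic are invented for
# illustration and describe no real system.

def engage_with_human_in_loop(target, operator_approves):
    """Weapon release waits for an explicit human decision."""
    if operator_approves(target):  # a person reviews and confirms
        return "FIRE"
    return "HOLD"

def engage_autonomously(target, classifier):
    """'Fully autonomous engagement without human intervention':
    the machine's own classifier is the only gate on firing."""
    if classifier(target) == "hostile":
        return "FIRE"
    return "HOLD"

# The same target, two very different chains of accountability.
target = {"type": "vehicle", "behavior": "approaching checkpoint"}
print(engage_with_human_in_loop(target, lambda t: False))  # HOLD
print(engage_autonomously(target, lambda t: "hostile"))    # FIRE

The code itself is trivial; the point is where the decision lives – with a person who can be held responsible, or with software whose judgment is only as good as its sensors and training.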
"Let's put it this way," says Lt. Col. David Thomp*son, project manager of the Army's robotic office. "We've seen the success of unmanned air vehicles that have been armed. This [weaponizing robots] is a natural extension."
At the Georgia Institute of Technology in Atlanta, Ronald Arkin is researching a stunning premise: whether robots can be created that treat humans on the battlefield better than human soldiers treat each other. He has pored over the first study of US soldiers returning from the Iraq war, a 2006 US Surgeon General's report that asked troops to evaluate their own ethical behavior and that of their comrades.
He was struck by "the incredibly high level of atrocities that are witnessed, committed, or abetted by soldiers." Modern warfare has not lessened the impact on soldiers. It is as stressful as ancient hand-to-hand combat with axes, he argues, because of the sorts of quick decisions that fighting with modern technology requires.
"Human beings have never been designed to operate under the combat conditions of today," he says. "There are many, many problems with the speed with which we are killing right now – and that exacerbates the potential for violation of laws of war."
With Pentagon funding, Dr. Arkin is looking at whether it is possible to build robots that behave more ethically than humans – to not be tempted to shoot someone, for instance, out of fear or revenge.
The key, he says, is that the robot should "first do no harm, rather than 'shoot first, ask questions later.' "
Such technology requires what Arkin calls an "ethical adaptor," which involves following orders. Learning, he explains, is potentially dangerous when it comes to making decisions about whether to kill. "You don't want to hand soldiers a gun and say, 'Figure out what's right and wrong.' You tell them what's right and wrong," he says. "We want to do the same for these robotic systems."
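A rough sketch of that idea follows – fixed, explicitly stated prohibitions that veto an engagement, rather than rules the machine learns for itself. The rule names below are hypothetical illustrations, not taken from Arkin's actual system:

# Hypothetical sketch of a rule-based ethical constraint: the robot is
# told up front what is impermissible, rather than learning it. The
# prohibitions below are invented examples, not Arkin's real rules.

PROHIBITIONS = [
    ("target not positively identified",
     lambda s: not s["target_positively_identified"]),
    ("target is surrendering",
     lambda s: s["target_is_surrendering"]),
    ("civilians in blast radius",
     lambda s: s["civilians_in_blast_radius"] > 0),
]

def may_engage(situation):
    """'First do no harm': any single violated rule vetoes engagement."""
    for reason, violated in PROHIBITIONS:
        if violated(situation):
            return False, reason
    return True, None

# Example: positively identified target, no civilians nearby.
print(may_engage({
    "target_positively_identified": True,
    "target_is_surrendering": False,
    "civilians_in_blast_radius": 0,
}))  # (True, None)

Note the design choice implicit in Arkin's framing: the default answer is "hold fire," and engagement is permitted only when no prohibition applies – the opposite of "shoot first, ask questions later."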
The aim, says Arkin, is not to be perfect, "but if we can achieve this goal of outperforming humans, we have saved lives – and that is the ultimate benchmark of this work."
Other research into armed robots centers not so much on outperforming humans as on being able to work with them. In the not-too-distant future, military officials envision soldiers and robots teaming up in the field, with the troops able to communicate with machines the way they would with a fellow squad member. Eventually, says Thompson, the robot-soldier relationship could become even more collaborative, with one human soldier leading many armed robots.
After that, the scenarios start to sound like something out of a film studio. For instance, retired Navy Capt. Robert Moses, president of iRobot's government and industrial relations division, can envision the day of humanless battlefields.
"I think the first thing to do is to go ahead and have the Army get comfortable with the robot," he says. One day, though, "you could write a scenario where you have an unmanned battle space – a 'Star Wars' approach."
These developments raise questions that ethicists are just beginning to unravel. This includes Peter Asaro, who last year formed the International Committee for Robot Arms Control. He's grappling with conundrums like: What, to a machine, counts as "about to shoot me?" How does a robot make a distinction between a dog, a man, and a child? How does it tell an enemy from a friend?
Such things are not entirely abstract. An automated "sentry robot" now stands guard in the demilitarized zone between North and South Korea, equipped with heat, voice, and motion sensors, as well as a 5 mm machine gun. What if it starts firing, accidentally or otherwise?
Within their own ranks, military officials are asking themselves similar questions. In March, the Navy launched a program at its postgraduate school in Monterey that explores the legal, social, and cultural impacts of unmanned systems. "Are we going to give the ability to a robot for conducting a killing operation based on its own software and sensors?" asks retired Navy Capt. Jeffrey Kline, who is directing the new effort. "That rightly causes a lot of red flags."
In part, military officials feel they have to develop these new systems to stay ahead of America's enemies, many of whom will be creating their own versions of automated armies. Yet that could lead to what some consider a 21st-century arms race and encourage others to use the new weapons.
Late last month, federal authorities charged a Massachusetts man with plotting an attack on the US Capitol and the Pentagon using a large, remote-controlled aircraft filled with explosives. Earlier this year, Libyan rebels contacted Aeryon Labs Inc., a Canadian drone manufacturer, about buying a small unmanned helicopter. "Ultimately, I think they found us through Googling. That's how a lot of people find us," says Dave Kroetsch, Aeryon's president. Aeryon officials say they get inquiries from militaries all over the world, which is one reason they have decided not to sell weaponized drones.
In the end, the emerging era of remote-control warfare – like evolutions in warfare throughout history – will likely create profound new capabilities as well as profound new problems for the US. The key will be making the most of the former while minimizing the latter.
"There are many futures that can be created," says Georgia Tech roboticist Arkin. "Hopefully, we can create, I won't say a utopian, but at least not a dystopian one."