Tuesday, June 5, 2012

ROBOTS GO TO WAR 2

Dragonflies, fleas and dogs
Military robots come in an astonishing range of shapes and sizes. DelFly, a dragonfly-shaped surveillance drone built at the Delft University of Technology in the Netherlands, weighs less than a gold wedding ring, camera included. At the other end of the scale is America’s biggest and fastest drone, the $15m Avenger, the first of which recently began testing in Afghanistan. It uses a jet engine to carry up to 2.7 tonnes of bombs, sensors and other payloads at more than 740kph (460mph).
On the ground, robots range from truck-sized to tiny. TerraMax, a robotics kit made by Oshkosh Defense, based in Wisconsin, turns military lorries or armoured vehicles into remotely controlled or autonomous machines. And smaller robotic beasties are hopping, crawling and running into action, as three models built by Boston Dynamics, a spin-out from the Massachusetts Institute of Technology (MIT), illustrate.
By jabbing the ground with a gas-powered piston, the Sand Flea can leap through a window, or onto a roof nine metres up. Gyro-stabilisers provide smooth in-air filming and landings. The 5kg robot then rolls along on wheels until another hop is needed—to jump up some stairs, perhaps, or to a rooftop across the street. Another robot, RiSE, resembles a giant cockroach and uses six legs, tipped with short, Velcro-like spikes, to climb coarse walls. Biggest of all is the LS3, a four-legged dog-like robot that uses computer vision to trot behind a human over rough terrain carrying more than 180kg of supplies. The firm says it could be deployed within three years.
Demand for land robots, also known as unmanned ground vehicles (UGVs), began to pick up a decade ago after American-led forces knocked the Taliban from power in Afghanistan. Soldiers hunting Osama bin Laden and his al-Qaeda fighters in the Hindu Kush were keen to send robot scouts into caves first. Remote-controlled ground robots then proved enormously helpful in the discovery and removal of makeshift roadside bombs in Afghanistan, Iraq and elsewhere. Visiongain, a research firm, reckons a total of $689m will be spent on ground robots this year. The ten biggest buyers, in descending order, are America, Israel (a distant second), Britain, Germany, China, South Korea, Singapore, Australia, France and Canada.
Robots’ capabilities have steadily improved. Upload a mugshot into an SUGV, a briefcase-sized robot that runs on caterpillar tracks, and it can identify a man walking in a crowd and follow him. Its maker, iRobot, another MIT spin-out, is best known for its robot vacuum cleaners. Its latest military robot, FirstLook, is a smaller device that also runs on tracks. Equipped with four cameras, it is designed to be thrown through windows or over walls.
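In outline, the mugshot-following trick is a conventional computer-vision pipeline: match faces in the live video feed against the uploaded reference image, then steer towards the best match. The Python sketch below, using the OpenCV library, is a minimal illustration of that idea only; iRobot’s actual software is proprietary, and the file name, similarity threshold and camera index are invented.

import cv2
import numpy as np

# Illustrative sketch of mugshot matching, not iRobot's software.
# Requires OpenCV's contrib modules (pip install opencv-contrib-python).
detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Train a simple LBPH recogniser on the single uploaded mugshot (label 0).
mugshot = cv2.imread("mugshot.jpg", cv2.IMREAD_GRAYSCALE)  # hypothetical file
recogniser = cv2.face.LBPHFaceRecognizer_create()
recogniser.train([mugshot], np.array([0]))

video = cv2.VideoCapture(0)  # stand-in for the robot's camera feed
while True:
    ok, frame = video.read()
    if not ok:
        break
    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Detect candidate faces, then score each one against the mugshot.
    for (x, y, w, h) in detector.detectMultiScale(grey, 1.1, 5):
        crop = cv2.resize(grey[y:y + h, x:x + w], mugshot.shape[::-1])
        label, distance = recogniser.predict(crop)
        if distance < 60:  # arbitrary similarity threshold
            # A real robot would steer towards this point; print instead.
            print("target at", x + w // 2, y + h // 2)

A fielded system would use far more robust recognition and a proper motion tracker, but the division of labour (detect, match, steer) is the same.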
Another throwable reconnaissance robot, the Scout XT Throwbot made by Recon Robotics, based in Edina, Minnesota, was one of the stars of the Ground Robotics Capabilities conference held in San Diego in March. Shaped like a two-headed hammer with wheels on each head, the Scout XT has the heft of a grenade and can be thrown through glass windows. Wheel spikes provide traction on steep or rocky surfaces. In February the US Army ordered 1,100 Scout XTs for $13.9m. Another version, being developed with the US Navy, can be taken to a ship inside a small aquatic robot, and will use magnetic wheels to climb up the hull and onto the deck, says Alan Bignall, Recon’s boss.
Even more exotic designs are in development. DARPA, the research arm of America’s Department of Defense, is funding the development of small, soft robots that move like jerky slithering blobs. EATR, another DARPA project, is a foraging robot that gathers leaves and wood for fuel and then burns it to generate electricity. Researchers at Italy’s Sant’Anna School of Advanced Studies, in Pisa, have designed a snakelike aquatic robot. And a small helicopter drone called the Pelican, designed by German and American companies, could remain aloft for weeks, powered by energy from a ground-based laser.
Robots on the front line: iRobot’s SUGV; the Recon Robotics Scout XT Throwbot; and Rafael’s Samson Remote Weapon Station
All this technology may not always provide a meaningful advantage. This year the US Marine Corps will start testing Boston Dynamics’s four-legged beast of burden, the LS3. Its elaborate design keeps it upright even on rocky ground, and it is very difficult to knock over. But its petrol engine makes it as loud as a lawnmower. The Taliban have a much stealthier system, notes a former French army lieutenant. Their mules quietly eat grass.
A slippery slope to war?
A larger worry is that countries with high-performance military robots may be more inclined to launch attacks. Robots protect soldiers and improve their odds of success. Using drones sidesteps the tricky politics of putting boots on foreign soil. In the past eight years drone strikes by America’s Central Intelligence Agency (CIA) have killed more than 2,400 people in Pakistan, including 479 civilians, according to the Bureau of Investigative Journalism in London. Technological progress appears to have contributed to an increase in the frequency of strikes. In 2005 CIA drones struck targets in Pakistan three times; last year there were 76 strikes there. Do armed robots make killing too easy?
Not necessarily. When Mary Cummings, a former US Navy pilot, stopped flying F-18 fighter jets in 1997, there were no video links between cockpits and command centres, and even radio contact was patchy at times. As a result, pilots often made their own calls on whether or not to strike. Today’s drones, blimps, unmanned boats and reconnaissance robots collect and transmit so much data, she says, that Western countries now practise “warfare by committee”. Government lawyers and others in operations rooms monitor video feeds from robots to call off strikes that are illegal or would “look bad on CNN”, says Ms Cummings, who is now a robotics researcher at MIT. And unlike pilots at the scene, these remote observers are unaffected by the physical toll of flying a jet or the adrenalin rush of combat.
In March Britain’s Royal Artillery began buying robotic missiles designed by MBDA, a pan-European missile-maker. The Fire Shadow is a “loitering munition” capable of travelling 100km, more than twice the maximum range of a traditional artillery shell. It can circle in the sky for hours, using sensors to track even a moving target. A human operator, viewing a video feed, then issues an instruction to attack, to fly elsewhere to find a better target, or to abort the mission by destroying itself. But bypassing the human operator to automate attacks would be, technologically, in the “realm of feasibility”, an MBDA spokesman says.
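The operator’s three-way choice amounts to a small state machine. The sketch below paraphrases it in Python; the names, the loop structure and the return strings are invented for illustration and are not MBDA’s interface.

from enum import Enum, auto

# Hypothetical paraphrase of the three options described above.
class Command(Enum):
    ATTACK = auto()          # dive on the currently tracked target
    RETARGET = auto()        # fly elsewhere and look for a better target
    SELF_DESTRUCT = auto()   # abort the mission by destroying the munition

def loiter(candidate_targets, decide):
    """Circle for hours, offering each tracked target to a human
    operator, whose decision drives the munition's next move."""
    for target in candidate_targets:
        cmd = decide(target)  # the operator watches the video feed
        if cmd is Command.ATTACK:
            return f"engaging {target}"
        if cmd is Command.SELF_DESTRUCT:
            return "munition destroyed, mission aborted"
        # Command.RETARGET: keep loitering and track the next candidate
    return "munition destroyed, no target approved"

# Example: the operator rejects a car, then approves a tank.
print(loiter(["car", "tank"],
             lambda t: Command.ATTACK if t == "tank" else Command.RETARGET))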
Could the “man in the loop” be removed from robotic weapons? The Israel Defence Forces have installed “combat proven” robot machineguns along the country’s borders. When sensors detect an intruder, the barrel pivots to follow him. A human soldier, watching the scene remotely via a fibre-optic link, decides whether to issue a warning (through a loudspeaker) or press the fire button. The robot sentry, the Samson Remote Weapon Station, could function without human intervention, says David Ishai of Rafael, its Israeli manufacturer, based in Haifa. But, he says, switching to automatic mode would be a bad idea—and illegal to boot.
Traditional rules of engagement stipulate that a human must decide if a weapon is to be fired. But this restriction is starting to come under pressure. Already, defence planners are considering whether a drone aircraft should be able to fire a weapon based on its own analysis. In 2009 the authors of a US Air Force report suggested that humans will increasingly operate not “in the loop” but “on the loop”, monitoring armed robots rather than fully controlling them. Better artificial intelligence will eventually allow robots to “make lethal combat decisions”, they wrote, provided legal and ethical issues can be resolved.
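The difference between the two postures is easiest to see as control flow: with a human “in the loop” nothing fires without an affirmative command, whereas “on the loop” the machine proceeds unless a human vetoes it within some window. The Python sketch below is schematic only; the class, the method names and the five-second veto window are invented.

import time

def fire(target):
    print("firing at", target)  # placeholder for the weapon itself

class Operator:
    """Stand-in for a human console; all three methods are hypothetical."""
    def authorise(self, target):
        return False  # in-the-loop default: no command, no strike
    def has_vetoed(self, target):
        return False
    def notify(self, target):
        print("operator notified of", target)

def engage_in_the_loop(target, op):
    # Human IN the loop: the default is to do nothing.
    if op.authorise(target):
        fire(target)

def engage_on_the_loop(target, op, veto_window_s=5.0):
    # Human ON the loop: the default is to fire unless vetoed in time.
    op.notify(target)
    deadline = time.monotonic() + veto_window_s
    while time.monotonic() < deadline:
        if op.has_vetoed(target):
            return  # strike called off
        time.sleep(0.1)
    fire(target)  # silence is treated as consent

op = Operator()
engage_in_the_loop("pickup truck", op)  # nothing happens: no authorisation
engage_on_the_loop("pickup truck", op)  # fires once the window expires

In the first mode silence means no strike; in the second it means consent. That inversion of the default is what the shift from “in” to “on” the loop amounts to.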
A report on the matter issued by Britain’s Ministry of Defence last year argued that if a drone’s control system takes appropriate account of the law of armed conflict (basically military necessity, humanity, proportionality and the ability to distinguish between military targets and civilians), then an autonomous strike could meet legal norms. Testing and certifying such a system would be difficult. But the authors concluded that “as technology matures…policymakers will need to be aware of the potential legal issues and take advice at a very early stage of any new system’s procurement cycle.”
Pressure will grow for armies to automate their robots if only so machines can shoot before being shot, says Jürgen Altmann of the Technical University of Dortmund, in Germany, a founder of the International Committee for Robot Arms Control, an advocacy group. Some robot weapons already operate without human operators to save precious seconds. An incoming anti-ship missile detected even a dozen miles away can be safely shot down only by a robot, says Frank Biemans, head of sensing technologies for the Goalkeeper automatic ship-defence cannons made by Thales Nederland.
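The arithmetic behind Mr Biemans’s claim is stark. Assuming, for illustration, a subsonic sea-skimming missile at about Mach 0.9 and a supersonic one at about Mach 2.5 (neither figure comes from Thales), the engagement window looks like this:

# Back-of-envelope engagement windows; the missile speeds are assumptions.
DETECTION_RANGE_M = 19_000  # "a dozen miles" is roughly 19km

for name, speed_m_s in [("subsonic sea-skimmer (~Mach 0.9)", 306),
                        ("supersonic missile (~Mach 2.5)", 850)]:
    window_s = DETECTION_RANGE_M / speed_m_s
    print(f"{name}: about {window_s:.0f}s to detect, decide and destroy")

That works out at roughly a minute for the slow missile and little more than 20 seconds for the fast one, which is why ship-defence cannons like Goalkeeper are permitted to fire on their own.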
Admittedly, that involves a machine destroying another machine. But as human operators struggle to assimilate the information collected by robotic sensors, decision-making by robots seems likely to increase. This might be a good thing, says Ronald Arkin, a roboticist at the Georgia Institute of Technology, who is developing “ethics software” for armed robots. By crunching data from drone sensors and military databases, it might be possible to predict, for example, that a strike from a missile could damage a nearby religious building. Clever software might be used to call off attacks as well as initiate them.
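Mr Arkin’s software has not been published, but the kind of check he describes can be sketched as a simple geometric constraint: estimate the weapon’s damage radius and abort if any protected site in a database falls inside it. Everything below, including the site list, the coordinates and the radius, is a hypothetical illustration.

import math

# Hypothetical pre-strike constraint check; the data and radii are
# invented and are not taken from Mr Arkin's software.
PROTECTED_SITES = [
    # (label, x metres, y metres) in some local grid
    ("mosque",   120.0,  80.0),
    ("hospital", 900.0, 400.0),
]

def strike_permitted(target_x, target_y, damage_radius_m):
    """Allow a strike only if no protected site lies inside the
    weapon's estimated damage radius around the aim point."""
    for label, sx, sy in PROTECTED_SITES:
        if math.hypot(sx - target_x, sy - target_y) <= damage_radius_m:
            print(f"abort: {label} within {damage_radius_m}m of aim point")
            return False
    return True

print(strike_permitted(100.0, 100.0, 50.0))  # False: the mosque is too close
print(strike_permitted(500.0, 500.0, 50.0))  # True: no protected site nearby

A real system would need far richer models of blast effects and far better maps; the point of the sketch is only the shape of the check.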
In the air, on land and at sea, military robots are proliferating. But the revolution in military robotics does have an Achilles heel, notes Emmanuel Goffi of the French air-force academy in Salon-de-Provence. As robots become more autonomous, identifying a human to hold accountable for a bloody blunder will become very difficult, he says. Should it be the robot’s programmer, designer, manufacturer, human overseer or his superiors? It is hard to say. The backlash from a deadly and well-publicised mistake may be the only thing that can halt the rapid march of the robots.
