Beware Robot Hit Men

July 20, 2016

Image via Vice Motherboard

Better facial recognition could lead to autonomous assassins

by NATALIE O’NEILL

From catching thieves to finding lost pets, facial recognition technology has already done a fair bit of good for humanity. But artificial intelligence experts warn new developments in the field could soon trigger a more troubling use of facial recognition software — weapons that function like robot hit men, complete with “vision” as accurate as the human eye.

The controversial new technology is poised to hit the market in a few years and could spark “a third revolution in warfare,” said the University of Montreal’s Yoshua Bengio, who leads the foremost research group on the powerful A.I. technique known as “deep learning.”

Unlike weaponized drones, which remote human pilots use to strike geographic areas based on various “signatures” of people the U.S. government believes to be militants, the killing machines of the future, Bengio predicts, will be precise enough to recognize, and take out, a single person in a crowd of thousands.

And perhaps the most frightening difference is that tomorrow’s killer robots could act on their own, pulling the trigger with no human being in the loop, he added.


“There could be an arms race for these types of things,” Bengio told me. “In the wrong hands, it could be used in a way that’s against our ethics. It could be used to eliminate someone, like poof.”

This kind of autonomous assassin would be able to roll or fly. It could be as big as a quadcopter or as small as a bird, programmed to hunt down a person by matching his or her face to a database of images. Bengio said this sort of tech will first be used by the military or law enforcement, possibly in a matter of years.

On July 7, 2016, after a sniper gunned down five Dallas police officers at a Black Lives Matter demonstration, police used a “bomb robot” to kill a suspect on American soil for the first time ever. Although a human being drove the explosive-rigged robot and detonated the charge, the incident was a dramatic reminder that machines are capable of doing humanity’s lethal dirty work at home and abroad.

CSKR/Flickr photo

Robotic hit men of the sort Bengio warns about don’t yet exist, but the mere prospect of the technology, sharpened by better facial recognition, already raises a knot of privacy and ethical concerns. In the future, law enforcement will likely collect the images of American citizens — possibly even those without criminal records — for such databases, critics have predicted.

As of now, there’s no legislation anywhere in the world banning lethal autonomous weapons, but advocacy groups like the Campaign to Stop Killer Robots are pushing for an international treaty to halt their development. The farther removed humans are from the front lines of war, the easier it is to kill, according to the CSKR.

“It is crucially important for the international community to establish a norm that prohibits delegating the authority to take human lives to machines,” Peter Asaro, a representative for the campaign, told me.


Meanwhile, in recent years researchers have made sweeping advances in the speed and accuracy of facial recognition systems. The systems analyze the characteristics of a person’s face, including distances between the eyes, nose, and mouth, before matching that to a database of images. Scientists have unlocked the secrets to using the tool successfully at greater distances with less light, and even in the dark.
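To make that matching step concrete, here is a minimal, purely illustrative sketch in Python. It assumes each face has already been reduced to a numeric feature vector (for instance, normalized distances between the eyes, nose and mouth); the names, vectors and threshold below are made up, and real systems rely on learned embeddings, far larger databases and probabilistic matching.

```python
import numpy as np

# Hypothetical watch list: each known person is represented by a small
# feature vector derived from facial landmarks (e.g. normalized distances
# between the eyes, nose and mouth). Values here are invented.
KNOWN_FACES = {
    "person_a": np.array([0.42, 0.31, 0.58, 0.27]),
    "person_b": np.array([0.39, 0.35, 0.61, 0.22]),
}

def match_face(probe, database, threshold=0.05):
    """Return the closest identity in the database, or None if nobody
    falls within the distance threshold."""
    best_name, best_dist = None, float("inf")
    for name, features in database.items():
        dist = np.linalg.norm(probe - features)  # Euclidean distance
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None

# A probe vector measured from a new image (also invented).
probe = np.array([0.41, 0.32, 0.58, 0.26])
print(match_face(probe, KNOWN_FACES))  # prints "person_a"
```

The nearest-neighbor lookup is the simplest possible choice; the privacy concerns in this story stem from how easily such a lookup scales once the database holds millions of faces.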

In Bengio’s field of deep learning, which seeks to create machines that mimic the human brain, computers are getting smarter when it comes to facial recognition. When a computer detects only half of a face, for example, it can use machine learning to guess the rest, according to Bengio and other deep learning experts.
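As a loose illustration of that idea, the toy PyTorch model below maps the visible half of a face (flattened to a vector of pixel intensities) to a guess at the full face. The architecture, sizes and names are assumptions made for this sketch; it is untrained, and real deep-learning systems use far deeper networks trained on enormous image collections.

```python
import torch
import torch.nn as nn

class HalfFaceCompleter(nn.Module):
    """Toy model: given the visible half of a 64x64 face (flattened),
    predict pixel intensities for the full face."""
    def __init__(self, half_dim=32 * 64, full_dim=64 * 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(half_dim, 512),
            nn.ReLU(),
            nn.Linear(512, full_dim),
            nn.Sigmoid(),  # keep predicted intensities in [0, 1]
        )

    def forward(self, visible_half):
        return self.net(visible_half)

# Usage sketch: one half-face, 32x64 pixels, flattened into a row vector.
model = HalfFaceCompleter()
visible = torch.rand(1, 32 * 64)
guessed_full_face = model(visible)  # shape (1, 64 * 64), an untrained guess
```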

Scientists already have access to the technology needed to create a robot hit man, Bengio and other experts said. Now, it’s only a matter of building one. “It might take time to engineer such a device, but the basic science exists, at least on the A.I. side — and I don’t see why not on the robotic weapon side,” he said.

Bengio said he isn’t aware of any firm that has gone public with a plan to build lethal weapons that target specific individuals autonomously. But U.S. military operations are depending more and more on robotic technology to carry out “kill lists,” according to “The Drone Papers,” published last year by The Intercept.

On the law enforcement side, the company Taser International plans to build police body cameras that use facial recognition to nab suspects.

Even scientists at the forefront of A.I. are alarmed by the possibilities. Thousands of respected scientists — including Stephen Hawking and Noam Chomsky — came together last July in an open letter that called for a global ban on autonomous weapons. In their letter, the signatories urged lawmakers and tech honchos to proceed with caution by supporting international agreements to ban building the weapons.

“The stakes are high — autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms,” reads the letter, which is signed by Bengio and 17,700 other people. “Starting a military AI arms race is a bad idea, and should be prevented by a ban on offensive autonomous weapons beyond meaningful human control.”

In April 2016, representatives from 14 countries and regions, including Zimbabwe, Pakistan and Palestine, gathered at an informal meeting of experts at the United Nations to demand a preemptive ban on robotic killing machines. The meeting helped the campaign earn more supporters worldwide, a rep for the group said. “Momentum is building rapidly.”

In December, Algeria, Chile, Costa Rica, Mexico and Nicaragua will consider whether to adopt the ban.

This story originally appeared at Vice Motherboard.

