Researchers have been pursuing facial recognition technologies since the 1960s, but it's only in the last few years that these systems have become so alarmingly capable. Neural networks that can match faces across thousands of features with better than 98 percent accuracy, along with an explosion of available training data sets, have fueled the field's recent spate of advancements. Yet even though these algorithms can spot a person in low-quality security camera footage, they remain dreadful at differentiating between people with darker skin tones.
Facial recognition systems aren't all bad. They offer people convenience, as any Apple Face ID user can attest, as well as quick and seamless security access. Unless you're stuck in a Nick Cage-John Travolta action thriller, you'll never have to worry about someone gaining unauthorized access to your devices. The technology has also proven a boon to law enforcement, enabling officers to more quickly track down suspects, as the NYPD did with an armed rapist last August, or to identify lost children and addled senior citizens. Heck, even pop star Taylor Swift uses the technology at her shows to foil stalkers.
But the same power and versatility that make facial recognition so useful are what make it so dangerous. China's authoritarian government has long used mass surveillance and facial recognition to keep tabs on its citizens. London's Met just last week followed suit, announcing that it will formalize its use of the controversial ClearView system, which can track people in real time. A number of American law enforcement organizations are equally eager to install, or in some cases expand, their surveillance capabilities at the expense of citizens' privacy and civil liberties. And yet, despite the technology's shortcomings in accuracy and its massive potential for misuse, only a trio of states have sought to stop its adoption. For better or worse, facial recognition is not going away; it's now a question of how much damage it will do before our elected leaders seek to restrain it.