Op-ed: Block the misuse of facial recognition, not the tech itself

Staking a claim in the public debate over facial recognition software, presidential hopeful Sen. Bernie Sanders (I-Vt.) has called for a ban on police use of the technology. While cities like San Francisco, Oakland, Calif., and Somerville, Mass., have already banned their local law enforcement agencies from using it, Sanders is the first presidential candidate to take such an aggressive stand.

There are well-documented reasons to explore such a ban, so long as the policy is narrowly targeted rather than overly blunt. It should not hinder the future of the technology in other applications where it shows real potential.

Facial recognition software could land innocent people in jail. The American Civil Liberties Union (ACLU) demonstrated the risk in 2018 using Amazon’s “Rekognition” software, showing how these systems can make serious mistakes. When the ACLU searched a database of 25,000 publicly available arrest photos against every member of the U.S. House of Representatives and Senate, the system incorrectly identified 28 members of Congress as criminals. People of color were disproportionately falsely matched, including six members of the Congressional Black Caucus, among them civil rights leader Rep. John Lewis (D-Ga.).

In August, the ACLU repeated the experiment with California’s legislature, and almost one in five legislators was incorrectly identified as a criminal. London’s Scotland Yard has been monitoring crowds with facial recognition since August 2016. The first independent evaluation of the system, commissioned by the police force, found an error rate of 81 percent. A 2018 report on another UK law enforcement agency, the South Wales Police, found that its facial recognition system misidentified 2,300 people as potential criminals.

These results should be alarming, but not surprising. Multiple studies have found that while facial recognition algorithms are highly accurate at identifying white men, they perform poorly when shown images of women and people of color. One study found that systems misidentify African American women 35 percent of the time. Another found that while these systems identified the gender of lighter-skinned men almost flawlessly, they mistook women for men 19 percent of the time and mistook darker-skinned women for men 31 percent of the time.

These inaccuracies become even more concerning when you look at how the technology is currently being used by law enforcement. In my home state of Utah, for example, law enforcement agencies have regularly used facial recognition technology. Over the 15 months from July 2016 through October 2017, the state’s Department of Public Safety ran 1,291 face recognition searches on Utah driver’s license photos, largely at the request of other state and federal agencies, according to my own calculations. Nearly 15 percent of those searches (188) returned a “possible” or “likely” match.

Given the well-documented problems uncovered in the United States and the United Kingdom, it’s hard to have complete confidence in these results. And Utah’s searches are just a drop in the bucket compared to the more than 390,000 facial recognition searches the FBI conducted of federal and local databases, including state DMV databases, between 2011 and 2019.

Rethinking the use of facial recognition software by law enforcement does not mean writing off the technology, which has very real benefits. From giving “sight” to the blind to identifying drowsy drivers to robot pets that complement dementia care, a number of commercial applications could dramatically improve lives around the world. However inaccurate some of these systems are now, they will improve with time. And unlike in law enforcement, their inaccuracies do not carry costs as serious as throwing innocent people into the criminal justice system.

It’s wise to be cautious about how governments and law enforcement agencies use facial recognition. Mistakes inevitably happen with emerging technology, but the criminal justice system is not the place for mistakes. On the other hand, we should not let these fears undermine an otherwise-promising technology. Bernie Sanders is right to be concerned — let’s just make sure to remain focused on the real issues at hand.

CGO scholars and fellows frequently comment on a variety of topics for the popular press. The views expressed therein are those of the authors and do not necessarily reflect the views of the Center for Growth and Opportunity or the views of Utah State University.