Face Recognition Technology

Law enforcement use of facial recognition poses a profound threat to personal privacy, political and religious expression, and the fundamental freedom to go about our lives without having our movements and associations covertly monitored and analyzed.

This technology can be used to identify individuals in photos or videos, and law enforcement and other government agencies can use it to conduct dragnet surveillance of entire neighborhoods. Face surveillance technology is also prone to error, implicating people for crimes they haven’t committed.

It has been well documented by MIT, the Georgetown Center for Privacy and Technology, and the ACLU that these error rates – and the related consequences – are far higher for women and people with darker skin.

Regardless of your race or gender – and even if these disparate error rates were addressed – face surveillance must be stopped. Facial recognition surveillance presents an unprecedented threat to our privacy and civil liberties. It gives governments, companies, and individuals the power to spy on us wherever we go: tracking our faces at protests, political rallies, places of worship and more.



Technology Overview

Facial recognition technology is a form of computer vision, a field that seeks to understand and automate tasks that the human visual system can do. Computer vision tasks include methods for acquiring, processing, analyzing, and understanding digital images and videos, and then extracting data from them. Computer vision includes shape recognition, optical character recognition, and more; a minimal face-detection sketch follows the figures below.

Figure 1: Example of Computer Vision

Figure 2: Example of computer vision: object-recognition predictions for an image that includes a hand holding a thermometer (analyzed further below).
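
As a concrete illustration of the simplest of these tasks, the following is a minimal sketch of face detection, assuming the OpenCV library (cv2) is installed; the image filename is hypothetical, and the bundled Haar-cascade detector is used purely for illustration. Note that detection only locates faces; it does not identify anyone.

    # Minimal face-detection sketch using OpenCV's bundled Haar cascade.
    # Assumes `pip install opencv-python`; the image path is hypothetical.
    import cv2

    # Load OpenCV's pretrained frontal-face Haar cascade classifier.
    cascade_path = cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    face_detector = cv2.CascadeClassifier(cascade_path)

    # Read the image and convert to grayscale (Haar cascades expect grayscale).
    image = cv2.imread("street_scene.jpg")  # hypothetical input image
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Detect faces: returns a list of (x, y, width, height) bounding boxes.
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    print(f"Detected {len(faces)} face(s)")
    for (x, y, w, h) in faces:
        print(f"  face at x={x}, y={y}, size {w}x{h}")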

Facial analysis technologies fall into two categories:

Facial Recognition Technology (FRT) aims to recognize and authenticate individuals, producing a positive identification. This is achieved by extracting a feature set from a face image or video and comparing it against a database of enrolled faces (a sketch of this matching step appears below).

Estimation or Predictive Analysis refers to systems that rely on estimation algorithms to output a category or quantity such as age, emotional state, or degree of fatigue.

Both present significant threats to civil liberties as well as disproportionate harms for black and minority populations.
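
To make the FRT matching step concrete, the following is a minimal sketch that assumes faces have already been reduced to fixed-length feature vectors (“embeddings”) by some upstream model. The names, vectors, and threshold are all hypothetical; a real system would use embeddings from a trained deep network and a much larger database.

    # Minimal sketch of the FRT matching step: compare a probe face's feature
    # vector against a database of enrolled vectors. All values are hypothetical;
    # a real system would produce embeddings with a trained deep network.
    import numpy as np

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Similarity in [-1, 1]; higher means more alike."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    # Hypothetical enrolled database: name -> 4-dimensional embedding.
    database = {
        "alice": np.array([0.9, 0.1, 0.3, 0.5]),
        "bob":   np.array([0.2, 0.8, 0.6, 0.1]),
    }

    probe = np.array([0.88, 0.12, 0.28, 0.52])  # embedding of the face to identify

    # Find the closest enrolled identity.
    best_name, best_score = max(
        ((name, cosine_similarity(probe, emb)) for name, emb in database.items()),
        key=lambda pair: pair[1],
    )

    THRESHOLD = 0.95  # hypothetical; tuning this trades false matches for misses
    if best_score >= THRESHOLD:
        print(f"Match: {best_name} (similarity {best_score:.3f})")
    else:
        print(f"No confident match (best was {best_name} at {best_score:.3f})")

The choice of threshold is what trades false matches against missed matches, which is precisely where the error rates discussed below enter.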

Figure 3: Example of simple face detection, in which the computer vision system merely identifies that a “face” is present. Recognition goes further to confirm the identity.

Racial and Gender Bias

This MIT study examined facial-analysis software from IBM, Microsoft, and Face++ and uncovered an error rate of 0.8 percent for light-skinned men but up to 34.7 percent for dark-skinned women. Researcher Joy Buolamwini presented this research in a United States House Committee on Oversight and Government Reform hearing on facial recognition:

Figure 4: An excerpt from Joy Buolamwini’s research, showing that even the most sophisticated facial analysis and facial recognition systems are markedly unreliable on darker women. Notice the gap between darker women and the lighter individuals on the right.

Zooming in on the hand and thermometer in Figure 2 above, we find that modifying the image so that the hand is lighter-skinned removes “gun” from the computer-generated predictions. While such divergent outcomes may arise only occasionally, even an occasional manifestation of algorithmic bias against darker-skinned people could be the decisive factor in a wrongful arrest.

Figure 5: Impact of augmented pigmentation on computer-vision predictions.

These inaccuracies are broadly due to two factors:

  1. Unrepresentative training data: 
    1. Training data is what makes machine learning technologies like facial recognition so powerful, allowing large datasets to be analyzed quickly.
    2. Because these tools “learn” from the training data they are given, they inherit any biases that went into the creation of those datasets.
    3. So if a facial recognition algorithm is shown only faces of white men during training, it will be less able to identify anyone who is not a white man (see the sketch after this list).
  2. Encoded bias in algorithms:
    1. Technology inherits the biases of the people who design it, and cannot evolve or learn in a meaningful way beyond the parameters set by engineers.
    2. The overwhelming majority of software developers are middle-aged white men living in Silicon Valley.
    3. This lack of diversity means that errors will inevitably be introduced when these technologies interact with people and communities outside the life experiences of Bay Area technologists.
    4. The way software is developed, starting with a “minimum viable product”, means that problems in the system are often not fixed until the system is live and impacting people.
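
The effect of unrepresentative training data can be demonstrated with a small, self-contained experiment, sketched below. It trains a toy classifier on synthetic data in which one group vastly outnumbers another and then measures the error rate for each group; the groups, data, and model are all fabricated for illustration, and it assumes NumPy and scikit-learn are installed.

    # Toy demonstration: a classifier trained on imbalanced data performs worse
    # on the underrepresented group. All data here is synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n, center):
        """Synthetic 2-class data for one group, centered differently per group."""
        X0 = rng.normal(loc=center, scale=1.0, size=(n, 2))        # class 0
        X1 = rng.normal(loc=center + 2.0, scale=1.0, size=(n, 2))  # class 1
        X = np.vstack([X0, X1])
        y = np.array([0] * n + [1] * n)
        return X, y

    # Group A is heavily overrepresented in training; group B is scarce
    # and drawn from a shifted distribution.
    Xa_train, ya_train = make_group(1000, center=0.0)
    Xb_train, yb_train = make_group(20, center=4.0)
    X_train = np.vstack([Xa_train, Xb_train])
    y_train = np.concatenate([ya_train, yb_train])

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Evaluate on balanced held-out sets for each group.
    for name, center in [("group A (overrepresented)", 0.0),
                         ("group B (underrepresented)", 4.0)]:
        X_test, y_test = make_group(500, center)
        error = 1.0 - model.score(X_test, y_test)
        print(f"{name}: error rate {error:.1%}")

Because the model’s parameters are dominated by the overrepresented group, the error rate on the underrepresented group comes out dramatically higher, mirroring the disparities documented by Buolamwini and others.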

History of Abuse of Private Information

Abuse of confidential databases by police departments in Minnesota has been widely reported:

“A 2013 report by the state’s legislative auditor estimated more than half of Minnesota’s 11,000 law enforcement users of the Driver and Vehicle Services website made questionable searches in fiscal year 2012.” 
“The auditor’s report came after several high-profile cases, including that of a former Department of Natural Resources employee charged with illegally viewing the driver records of at least 5,000 people, mostly women. A Minnesota police officer who sued several agencies after her driver’s license information was snooped received more than $1 million in settlements.”

Even absent actively malicious abuse of confidential information, a combination of a culture of secrecy and technical ineptitude has led to accidental breaches of private information.

Nationally, the Associated Press reports, hundreds of officers receive reprimands each year for abusing private information:

“Among those punished: an Ohio officer who pleaded guilty to stalking an ex-girlfriend and who looked up information on her; a Michigan officer who looked up home addresses of women he found attractive; and two Miami-Dade officers who ran checks on a journalist after he aired unflattering stories about the department.” 

Unfettered access to facial recognition software will certainly expose vulnerable citizens to the asymmetrical power of officers and agents of the city.