Advocates Ring Alarm on Civil Rights Abuses of Facial Recognition Technology in Policing

FOR IMMEDIATE RELEASE
Contact: Rachel Hooper, [email protected] 

WASHINGTON – As the use of facial recognition technology by law enforcement across the nation continues to pose civil rights risks, The Leadership Conference Education Fund and the Center for Civil Rights and Technology today held a timely press briefing on the use of this technology and the real and potential harms it poses to historically marginalized communities.

Facial recognition technology has inaccurately identified individuals, implicating them in crimes they didn’t commit and leading to wrongful incarceration. One research study examining three facial-analysis programs found that the error rate in determining gender was 0.8 percent for light-skinned men but rose to 34.7 percent for darker-skinned women. Recently, the U.S. Commission on Civil Rights released a report naming several concerns with the government’s use of facial recognition technology, including wide variation in the accuracy of positive and negative matches, which can lead to discriminatory practices and violations of individuals’ civil rights in law enforcement. The report quoted testimony presented by The Leadership Conference during a hearing held by the commission in March 2024.

Policy experts from The Leadership Conference and its Center for Civil Rights and Technology, Tierra Bradford and Frank Torres, in addition to social justice organizer Tawana Petty and public defender Elizabeth Daniel Vasquez, highlighted examples of harms and explained the federal safeguards needed to protect people’s civil rights. They also discussed next steps, such as a Department of Justice study on facial recognition technology and other biometric technology, which would help to carry out President Biden’s Executive Order on Advancing Effective, Accountable Policing and Criminal Justice Practices to Enhance Public Trust and Public Safety.

Tierra Bradford, senior manager of the justice reform program at The Leadership Conference on Civil and Human Rights, said: “Unfortunately in the criminal-legal system, technology like FRT that is meant to be an advancement to improve accuracy in investigations is actually harmful due to its inaccuracy and disproportionate misidentification of Black, Brown, and Indigenous communities. This technology, which relies on racially biased arrest data, continues to perpetuate the existing disparities that are already baked into the criminal-legal system. Because of its inaccuracy and misuse by police departments, FRT should not be used to identify suspects in police investigations.”

Frank Torres, AI and privacy fellow at the Center for Civil Rights and Technology, said: “Innocent people have been wrongfully arrested due to faulty facial recognition technology. Wrongful arrests don’t protect anyone, nor does faulty and biased technology. The American people deserve better, and law enforcement must be held accountable for the tech they deploy. Any type of policing technology, especially facial recognition, must be assessed and safeguarded before its use, not after harm has already occurred. Sometimes, this means technology should not be used at all by law enforcement.”

Tawana Petty, social justice organizer, said: “Detroit already accounts for nearly half of the known cases in which police misidentified and arrested someone after leveraging facial recognition technology. Now we have at least one known case of license plate surveillance leading to misidentification and a car being unjustly impounded for over three weeks. The dangers of living in a panopticonic city appear to outweigh any proven benefits.”

Elizabeth Daniel Vasquez, director of the Science & Surveillance Project at Brooklyn Defender Services, said: “Law enforcement’s use of FRT is not only a prime example of how racial bias is deep-rooted in police surveillance technology, but it also exemplifies modern policing’s voracious appetite for collecting as much data as possible. The secretive deployment of facial similarity algorithms has reproduced the biases of decades of racist policing, targeting Black and Brown communities for increased surveillance and resulting in a series of misidentifications and wrongful arrests. Meanwhile, when data biases have been identified, many proposed attempts to ‘improve’ the algorithms simply justify the extraction of even more data from policed communities. To break the surveillance cycle, it is critical that we not focus solely on the supposed accuracy of this technology, but also on what it means for law enforcement to collect the data, how long they are keeping it, and how they are using it.”

The Leadership Conference also published a blog post interviewing Petty about her personal experience as a social justice organizer fighting against unjust uses of facial recognition technology in her community.

Video of the full press briefing is available here.

The Center for Civil Rights and Technology (Center) is a joint project of The Leadership Conference on Civil and Human Rights and The Leadership Conference Education Fund. The Center, launched in September 2023, serves as a hub for advocacy, education, and research at the intersection of civil rights and technology policy. Our experts dive into the most pressing policy issues in three key areas: AI and privacy, voting and platform accountability, and broadband access.

The Leadership Conference Education Fund builds public will for federal and state policies that promote and protect the civil and human rights of all persons in the United States. The Education Fund’s campaigns empower and mobilize advocates around the country to push for progressive change in the United States. It was founded in 1969 as the education and research arm of The Leadership Conference on Civil and Human Rights. For more information on The Education Fund, visit civilrights.org/edfund/. 


###