Civil Rights and Tech Advocates Sound Alarm on Racial Bias in ‘Predictive Policing’
WASHINGTON – Today a broad coalition of civil rights, privacy, and technology organizations issued a sweeping rebuke of police departments' use of predictive policing products, timed to the release of a report on the topic by the technology and policy firm Upturn.
The 17 organizations signed a shared statement of civil rights concerns about the systemic flaws, inherent bias, and lack of transparency endemic to predictive policing products and their vendors. The groups point out how these products exacerbate deep dysfunction and disproportionate policing practices that are “systemically biased against communities of color and allow unconscionable abuses of police power.”
Signers of the statement include The Leadership Conference on Civil and Human Rights, 18 Million Rising, American Civil Liberties Union, Brennan Center for Justice, Center for Democracy & Technology, Center for Media Justice, Color of Change, Data & Society Research Institute, Demand Progress, Electronic Frontier Foundation, Free Press, Media Mobilizing Project, NAACP, National Hispanic Media Coalition, Open MIC (Open Media and Information Companies Initiative), Open Technology Institute at New America, and Public Knowledge.
A new report entitled “Stuck in a Pattern” from Upturn — released with today’s statement — finds “a trend of rapid, poorly informed adoption” of predictive policing, with “pervasive, fundamental gaps in what’s publicly known” about how these systems work.
Predictive policing products are marketed with names like “Beware,” “HunchLab,” and “PredPol.” Police nationwide have begun using them to predict who might commit crimes, or where crimes might be committed, and to target policing to those people and places. However, as the groups state, “The data driving predictive enforcement activities … is profoundly limited and biased. … As a result, current systems reinforce bias and sanitize injustice.”
Among six key concerns in the statement, the groups condemn departments for not disclosing or seeking public input on the use of these products, and the vendors for “shrouding their products in secrecy, and even seeking gag clauses or asking departments to pledge to spend officer time resisting relevant public records requests.”
The statement also notes promising uses of data in policing, pointing out that, “Even within a broken criminal justice system, there are places where data can be a force for good: For example, data can identify mentally ill people for treatment rather than punishment, or provide early warning of harmful patterns of officer behavior. However, today, most ‘predictive policing’ is not used for such constructive interventions.”
“These technologies threaten the Constitution’s promises of equal protection under the law and due process, and its protections against unreasonable searches and seizures,” said Wade Henderson, president and CEO of The Leadership Conference on Civil and Human Rights. “This is fortune-teller policing that uses deeply flawed and biased data and relies on vendors that shroud their products in secrecy. Instead of using predictive technology to correct dysfunctional law enforcement, departments are using these tools to supercharge discrimination and exacerbate the worst problems in our criminal justice system.”
To request an interview with an expert on the use of these products or a signer of the statement, email [email protected].