Letter to Axon AI Ethics Board regarding Ethical Product Development and Law Enforcement

Dear Axon AI Ethics Board:

We write to express our strong interest in the Board’s upcoming work to guide Axon on ethics issues, and our serious concerns with the current direction of Axon’s product development. We are a broad coalition of national and local civil rights and civil liberties groups. Many of us represent communities that are deeply affected by law enforcement abuses.

Law enforcement in this country has a documented history of racial discrimination. Some agencies have routinely and systematically violated human and constitutional rights. Some have harassed, assaulted, and even killed members of our communities. These problems are frequent, widespread, and ongoing.

Because Axon’s products are marketed and sold to law enforcement, they can deepen these problems. For example, Axon’s body-worn camera systems, which should serve as tools for transparency and accountability, are instead being turned into powerful surveillance tools concentrated in heavily policed communities.

Axon has a responsibility to ensure that its present and future products, including AI-based products, don’t drive unfair or unethical outcomes or amplify racial inequities in policing. Axon acknowledges this responsibility—the company states that it “fully recognize[s] the complexities and sensitivities around technology in law enforcement, and [is] committed to getting it right.”

This Board must hold Axon to its word. We urge the Board to assert the following at the outset of its work:

  1. Certain products are categorically unethical to deploy.

Chief among these is real-time face recognition analysis of live video captured by body-worn cameras. Axon must not offer or enable this feature. Real-time face recognition would chill the constitutional freedoms of speech and association, especially at political protests. In addition, research indicates that face recognition technology will never be perfectly accurate and reliable, and that accuracy rates are likely to differ based on subjects’ race and gender.[1] Real-time face recognition therefore would inevitably misidentify some innocent civilians as suspects. These errors could have fatal consequences—consequences that fall disproportionately on certain populations. Real-time face recognition could also prime officers to perceive individuals as more dangerous than they really are and to use more force than the situation requires. No policy or safeguard can mitigate these risks well enough to justify offering real-time face recognition to law enforcement.

  2. Robust ethical review requires centering the voices and perspectives of those most impacted by Axon’s technologies.

This Board includes well-respected academics, practitioners, advocates, and law enforcement representatives. But an ethics process that does not center the voices of those who live in the most heavily policed communities will have no legitimacy. The Board must invite, consult, and ultimately center in its deliberations the voices of affected individuals and of those who directly represent affected communities. In particular, survivors of mass incarceration, survivors of law enforcement harm and violence, and community members who live closely among both populations must be included.

  3. Axon must pursue all possible avenues to limit unethical downstream uses of its technologies.

Axon’s design decisions can prevent some unethical uses of its products, but design alone is insufficient to ensure that the company’s technologies are used ethically. The Board should propose novel ways to limit unethical uses of Axon’s products. For instance, with the Board’s help, Axon could develop contractual terms that prohibit customers from using its products in unethical ways, and that allow Axon to withdraw products from customers if it learns of unethical or unlawful uses. The company could also refuse to sell a particular technology or feature to an agency unless the agency adopts vital policy safeguards that are transparent, enforceable, and supported by impacted communities. Axon could further make it easier for the public to learn how law enforcement agencies use its products by building transparency and accountability measures directly into its product designs. If Axon cannot effectively limit unethical downstream uses of a particular product, the Board should recommend against the development or sale of that product.

  4. All of Axon’s digital technologies require ethical review.

This Board should ensure that its scope includes all of Axon’s digital products, both because they could serve as data sources for future AI products and because they raise independent ethical concerns. For example, Axon’s Evidence.com is a massive central repository of digital evidence that, if improperly handled, could compromise the safety and privacy of both officers and civilians. Another existing product, Axon Citizen, allows community members to submit tips and evidence to law enforcement, a channel that could amplify racial bias and other discriminatory behavior. All of Axon’s current and future digital products should be examined by this Board.

We look forward to engaging with the Board as its work moves forward.

Signed,

18MillionRising.org
ACLU
AI Now Institute at NYU
Algorithmic Justice League
American Friends Service Committee
Center for Media Justice
Center on Privacy & Technology at Georgetown Law
Color of Change
Communities United for Police Reform (CPR)
Data for Black Lives
Democracy NC
Detroit Community Technology Project
Electronic Frontier Foundation
Electronic Privacy Information Center (EPIC)
Ella Baker Center for Human Rights
Fayetteville PACT
Free Press
Law for Black Lives – DC
Lawyers’ Committee for Civil Rights Under Law
Legal Aid Society
Media Alliance
Media Mobilizing Project
NAACP
NAACP Legal Defense and Educational Fund, Inc.
National Hispanic Media Coalition
National Urban League
NC Black Leadership and Organizing Collective
NC Black Women’s Roundtable
NC Statewide Police Accountability Network
New America’s Open Technology Institute
Open MIC (Open Media and Information Companies Initiative)
Our Data Bodies Project
Siembra NC
South Asian Americans Leading Together (SAALT)
The Leadership Conference Education Fund
The Leadership Conference on Civil and Human Rights
The Tribe
UnidosUS
Upturn
Urbana-Champaign Independent Media Center
WITNESS
Working Narratives

[1] For example, researchers at MIT recently demonstrated that multiple commercially available face characterization algorithms—performing a far simpler task than face recognition—exhibited disproportionately high error rates when presented with darker-skinned faces, and the highest error rates when presented with the faces of dark-skinned females. Joy Buolamwini & Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification (2018), http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf.