Letter to House Energy and Commerce Committee on the American Data Privacy and Protection Act


Dear Chairman Pallone and Ranking Member McMorris Rodgers,

On behalf of The Leadership Conference on Civil and Human Rights, a coalition charged by its diverse membership of more than 230 national organizations to promote and protect the rights of all persons in the United States, we thank you for the opportunity to submit our views on H.R. 8152, “The American Data Privacy and Protection Act” (“ADPPA”). We continue to see the ADPPA as the best opportunity to finally provide a strong law to protect Americans’ privacy and civil rights. We believe that such a law is needed to ensure that companies, including “Big Tech,” are held accountable for the data they collect and use, especially when those actions impact individuals’ lives. We appreciate your work, and that of Chair Schakowsky, Ranking Member Bilirakis, and your staffs to advance this important legislation. We ask that this statement be entered into the record of the full committee mark-up of the ADPPA on July 20, 2022.

As we stated in our June 14, 2022 letter[1] for the subcommittee hearing on the ADPPA, privacy rights are civil rights. Well-drafted comprehensive federal consumer privacy legislation will protect civil and human rights, empower communities of color, and ensure opportunities are open for marginalized populations.

As the use of individuals’ most personal information becomes more pervasive with the rise of new technologies and the proliferation of algorithms, it is critical that individuals’ data not be used in ways that harm them. The ADPPA provides that protection by prohibiting the collection and use of data in ways that discriminate based on race, color, religion, national origin, sex, or disability. The bill also requires companies to assess the impact of the technology they use and to take measures to mitigate potential harms. Thus, the ADPPA raises the bar for civil rights by addressing the collection and use of data related to technology, going beyond what is included in existing state privacy laws, including California’s. We continue to believe that an independent auditor should be used in conducting impact assessments to ensure that companies are held accountable.

The civil rights protections in the ADPPA remain strong and, as we stated in our earlier letter, will protect communities by:

  • Applying protections to the digital age: The bill prohibits the use of personal data to discriminate based on protected characteristics. This will address data practices and automated decision-making systems that have led to discrimination in housing,[2] employment,[3] credit,[4] education,[5] finance,[6] and other economic opportunities, discrimination that has negatively impacted communities of color.
  • Prohibiting algorithmic bias: This bill will prohibit algorithms from reproducing patterns of discrimination[7] in recruiting,[8] housing,[9] education,[10] finance,[11] mortgage lending,[12] credit scoring,[13] healthcare,[14] vacation rentals,[15] ridesharing,[16] and other services.
  • Requiring companies to perform impact assessments: The bill’s impact assessment provisions require companies to identify biases and mitigate harms: large companies like Google and Facebook will be required to assess their algorithms annually and submit annual algorithmic impact assessments to the FTC. Impact assessments must seek to mitigate harms related to: (1) advertising for housing, education, employment, healthcare, insurance, or credit and (2) access to or restrictions on places of public accommodation, and any disparate impact on the basis of an individual’s race, color, religion, national origin, gender, or disability status. The FTC is granted rulemaking authority to adopt rules establishing processes for submitting algorithmic impact assessments.
  • Requiring algorithms to be audited for bias: The bill requires companies to evaluate their algorithms at the design phase, which will help identify potential discriminatory impacts before they are deployed. Training data, which can be a cause of bias in AI systems, must be included in the evaluation.

In addition to the civil rights protections included in the ADPPA, the bill’s privacy protections will require companies to minimize the data they collect, provide for greater transparency, and further limit use of an individual’s information by defining permissible and impermissible purposes for collecting, sharing, and using personal data. It is also important that the ADPPA’s protections are enforceable, including by the FTC, state attorneys general, and individuals, as well as by state privacy authorities. We have been clear throughout the legislative process that each of these elements is needed, and we recognize that staff has been working with stakeholders to improve and clarify those provisions of the bill.

As the legislation advances through the House of Representatives, we remain committed to ensuring that the bill remains as strong as possible with the civil rights protections remaining intact. Together with robust underlying privacy and enforcement provisions, the bill will provide assurance to communities that their data will not be used to discriminate against them. We look forward to continuing to work with you, lawmakers, your staffs, and other stakeholders as the ADPPA moves forward in the legislative process.

Thank you for your consideration of our views. If you have any questions, please contact Anita Banerji, Senior Program Director, Media and Tech, at [email protected], or Frank Torres, Civil Rights Technology Fellow, at [email protected].

Sincerely,

Maya Wiley
President and CEO

Jesselyn McCurdy
Executive Vice President of Government Affairs

 

[1] The Leadership Conference on Civil and Human Rights, Views on Discussion Draft of The American Data and Privacy Act, civilrights.org.

[2] https://blogs.law.columbia.edu/hrlr/files/2020/11/251_Schneider.pdf.

[3] https://link.springer.com/article/10.1007/s40685-020-00134-w#Abs1.

[4] https://cpb-us-e1.wpmucdn.com/sites.suffolk.edu/dist/3/1172/files/2014/01/Rice-Swesnik_Lead.pdf.

[5] https://www.nytimes.com/2020/08/20/world/europe/uk-england-grading-algorithm.html.

[6] http://faculty.haas.berkeley.edu/morse/research/papers/discrim.pdf.

[7] https://www.demos.org/sites/default/files/2021-05/Demos_%20D4BL_Data_Capitalism_Algorithmic_Racism.pdf.

[8] https://www.upturn.org/static/reports/2018/hiring-algorithms/files/Upturn%20–%20Help%20Wanted%20-%20An%20Exploration%20of%20Hiring%20Algorithms,%20Equity%20and%20Bias.pdf.

[9] https://blogs.law.columbia.edu/hrlr/files/2020/11/251_Schneider.pdf.

[10] https://cdt.org/insights/algorithmic-systems-in-education-incorporating-equity-and-fairness-when-using-student-data/.

[11] https://faculty.haas.berkeley.edu/morse/research/papers/discrim.pdf.

[12] https://www.cnbc.com/2020/08/19/lenders-deny-mortgages-for-blacks-at-a-rate-80percent-higher-than-whites.html.

[13] https://cpb-us-e1.wpmucdn.com/sites.suffolk.edu/dist/3/1172/files/2014/01/Rice-Swesnik_Lead.pdf.

[14] https://www.science.org/doi/10.1126/science.aax2342.

[15] https://www.aeaweb.org/articles?id=10.1257/app.20160213.

[16] https://www.nber.org/system/files/working_papers/w22776/w22776.pdf.