Civil Rights Principles for Hiring Assessment Technologies
Preface
Hiring is a critical gateway to economic opportunity, determining who can access employment to support themselves and their families. Technology is rapidly changing every step of the employment selection process. Employers are now using new assessment tools that rely on artificial intelligence and algorithms to screen and select job candidates. These developments highlight the need for policymakers to clarify and strengthen guardrails to ensure that hiring assessments are used equitably.
Hiring assessments can take many different forms, including video interviews, gamified assessments, and resume screens. Many vendors claim that these tools objectively evaluate applicants against a set of traits or a profile associated with success in, or good fit for, a job. Employers use these assessments to reduce the time and cost of filling open positions with successful hires, and sometimes in hopes of hiring more diverse candidates. However, assessment technologies can invisibly automate large numbers of rejections by determining which applicants get serious consideration. As a result, hiring assessment technologies deserve close scrutiny.
Today’s workforce reflects decades of discrimination on the basis of race, ethnicity, sex (including sexual orientation and gender identity), disability, age, and other factors. Hiring assessment technologies are one of many barriers that may impede equity and inclusion in the workforce. Artificial intelligence, by its very nature, risks replicating and deepening existing inequities when it relies on data from the current workforce that is not sufficiently representative because of historical discrimination. Hiring assessment technologies must advance equity, not erect artificial barriers to employment. This will require proactive interventions by employers, vendors, and policymakers.
The following principles are offered to guide the development, use, auditing, and oversight of hiring assessment technologies, with the goals of preventing discrimination and advancing equity in hiring.
Principles
These principles should guide the development, use, auditing, and oversight of hiring assessment technologies. They both reinforce and strengthen existing legal obligations pertaining to the employment selection process. Employers already have significant obligations to administer nondiscriminatory hiring processes. Vendors and technology providers will need to play a more prominent role in helping ensure these obligations are met. Government must ensure robust enforcement of existing laws to promote these principles. And policymakers may need to develop new laws and guidance to ensure workers’ rights are protected.
As used below, hiring assessment technology refers to any assessment that relies in whole or in part on technology, including computer algorithms or statistical models, to help evaluate job applicants. Organization means any entity that creates, makes available, or uses hiring assessment technologies, including employers, employment agencies, hiring assessment developers and vendors, and companies that facilitate job-matching and recruitment.
1. Nondiscrimination
Hiring assessments should not discriminate based on protected characteristics such as race, color, ethnicity, religion, national origin, sex, gender identity, sexual orientation, age, familial status, disability, or genetic information.
- Hiring assessments built using historical data can reproduce patterns of systemic discrimination already present in the workforce. Machine learning algorithms can discover subtle correlations and proxies for protected characteristics, even when they are purposefully omitted from the model-building process.
Organizations should carefully test and scrutinize their hiring assessment models to ensure that they do not perpetuate discriminatory hiring patterns. Merely removing demographic data from the model-building process will not accomplish this goal; a sketch of one such proxy check follows this list.
- Hiring assessment technologies, including those using novel procedures like games or facial and voice analysis, can create new barriers for job applicants, including people with disabilities. Facial and voice analysis technologies, in particular, have been shown to be inaccurate for people of color, English speakers with non-native accents, and transgender, nonbinary, and gender nonconforming people.
Organizations should ensure that hiring assessments are designed and administered in ways that do not exclude people, including those with disabilities, and provide reasonable accommodations. Because there are limited data about the diverse dimensions of disability, organizations cannot prevent discrimination by statistical auditing alone. Organizations must also ensure that technologies do not unfairly discriminate against people of color, people with limited English proficiency, and transgender, nonbinary, and gender nonconforming people.
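To make the point about proxies concrete, the sketch below shows one way an organization might test whether the remaining features can reconstruct a protected characteristic after it has been dropped from model building. This is an illustrative check only: the feature names, data, and model are hypothetical, and passing such a test does not by itself establish nondiscrimination.

```python
# Illustrative sketch (hypothetical features and data): testing whether
# remaining features act as proxies for a protected characteristic that
# was dropped from the model-building process.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 5_000

# Hypothetical protected attribute, never given to the hiring model.
protected = rng.integers(0, 2, size=n)

# Hypothetical application features; "zip_income" is correlated with the
# protected attribute and can therefore serve as a proxy for it.
zip_income = 0.8 * protected + rng.normal(size=n)
years_experience = rng.normal(5.0, 2.0, size=n)
X = np.column_stack([zip_income, years_experience])

# If the features predict the protected attribute well above chance
# (an AUC near 0.5 would be chance), proxies are present: dropping the
# attribute alone did not remove the demographic signal.
auc = cross_val_score(LogisticRegression(), X, protected,
                      cv=5, scoring="roc_auc").mean()
print(f"Protected attribute recoverable from features: AUC = {auc:.2f}")
```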
2. Job-Relatedness
Hiring assessments should measure traits and skills that are important to job performance. Assessments based on criteria that are unnecessary to job performance risk creating artificial or discriminatory barriers to employment opportunity.
- Some hiring assessment technologies use machine learning to identify traits that are statistically correlated with job performance in a particular population and environment, regardless of their actual relation to job requirements. Statistical modeling cannot substitute for a rigorous job analysis.
Organizations should do the necessary work of studying and understanding the knowledge, skills, and abilities required by a particular job, and should not rely on hiring assessment technologies purporting to predict performance as a substitute for identifying job-related criteria. Moreover, mere correlations between traits and purported job performance should not be sufficient to justify adverse impact resulting from consideration of those traits.
- Machine-learning models often assess job applicants based on complex features that are not transparent or open to scrutiny by applicants, employers, or even the models’ developers.
Organizations should be able to describe what an assessment is measuring and why, as well as show that the assessment is actually measuring what it purports to measure, before deploying that assessment. One common starting point for that kind of scrutiny is sketched below.
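As one illustration of the scrutiny described above, the sketch below uses permutation importance, a common model-inspection technique, to ask which inputs actually drive a model's scores. The model, feature names, and data are hypothetical, and a real validation effort would need to go far beyond this single diagnostic.

```python
# Illustrative sketch (hypothetical model, features, and data): using
# permutation importance to ask which inputs drive an assessment model's
# scores, a starting point for describing what the model is measuring.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
n = 2_000
feature_names = ["skills_test_score", "resume_keyword_count", "speech_rate"]

X = rng.normal(size=(n, 3))
# Hypothetical "performance" label driven almost entirely by the first feature.
y = 0.9 * X[:, 0] + 0.1 * X[:, 2] + rng.normal(scale=0.3, size=n)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(random_state=0).fit(X_train, y_train)

# How much does held-out performance degrade when each feature is shuffled?
result = permutation_importance(model, X_test, y_test,
                                n_repeats=10, random_state=0)
for name, importance in zip(feature_names, result.importances_mean):
    print(f"{name}: importance {importance:.3f}")
```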
3. Notice and Explanation
Applicants should be meaningfully notified about how they will be assessed so they can seek redress under existing civil rights protections or request a reasonable accommodation.
- Applicants often receive little explanation about how they will be evaluated by assessments or feedback about their performance.
Organizations should provide applicants with meaningful disclosures about how assessments operate and inform employment decisions, including the information collected to evaluate job seekers. Under existing law, employers should provide sufficient information about the operation of assessments so that applicants can determine whether they need to seek reasonable accommodations, make clear the procedures for requesting reasonable accommodations, and ensure reasonable accommodation processes do not disadvantage applicants. All organizations should do the same.
- Hiring assessment technologies can be designed to automatically deliver meaningful feedback at scale. In other policy contexts, such as credit, federal law has recognized a right for consumers to receive adverse action notices that indicate standardized reasons for adverse decisions about them.
Organizations should provide applicants with reasonable and timely feedback on their performance on a hiring assessment.
4. Auditing
Hiring assessments should be thoroughly and regularly audited before and after deployment for discrimination and job-relatedness. Organizations and policymakers may need to develop new technical and legal standards to ensure applicants are protected.
- Auditing machine learning models often requires documentation and retention of information such as training data, designs, applicant information, assessment criteria, assessment outputs, and ultimate hiring decisions.
Organizations should retain and clearly identify the data necessary to regularly audit assessments for discrimination and job-relatedness, with appropriate privacy and data security protections.
- Statistical testing for disparate impacts based on race, ethnicity, gender, and age is critical but insufficient to prevent all forms of discrimination. For example, because people’s disabilities are so diverse, statistical tests are unlikely to reveal how a hiring assessment will impact any individual with a particular disability.
Organizations must continually and holistically audit their hiring assessments. This means scrutinizing the content of hiring assessments, measuring their outcomes, and considering employers’ overall hiring processes. A baseline version of such a statistical test is sketched after this list.
- Though organizations should engage in rigorous self-testing of their own hiring assessment technologies before and after deployment, auditing is often best conducted by independent third parties who can provide a greater degree of impartiality and accountability.
Organizations should work with third parties to routinely audit their assessments for discriminatory design and effects. Organizations should publicly disclose the methods and results of self-testing and third-party audits.
- Models that are developed and tested using vendors’ proprietary data and designs can have different results when used on actual applicants.
Once deployed, hiring assessments’ real-world performance must be continually audited for disparities.
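As a concrete illustration of the baseline statistical test referenced above, the sketch below compares group selection rates using the four-fifths (80%) rule of thumb drawn from U.S. employment-testing practice. The groups and counts are hypothetical, the threshold is a screening benchmark rather than a legal safe harbor, and, as these principles stress, such a test alone cannot prevent all forms of discrimination.

```python
# Illustrative sketch (hypothetical groups and counts): a baseline
# disparate-impact check comparing selection rates across groups. The
# four-fifths (80%) rule of thumb used here is a screening benchmark,
# not a legal safe harbor, and this kind of test is insufficient alone.
from collections import Counter

# (group, passed_assessment) pairs for hypothetical applicants.
outcomes = ([("A", True)] * 480 + [("A", False)] * 520
            + [("B", True)] * 300 + [("B", False)] * 700)

applicants = Counter(group for group, _ in outcomes)
selected = Counter(group for group, passed in outcomes if passed)

# Selection rate per group, compared against the highest-rate group.
rates = {group: selected[group] / applicants[group] for group in applicants}
highest = max(rates.values())

for group, rate in sorted(rates.items()):
    ratio = rate / highest
    flag = "REVIEW" if ratio < 0.8 else "ok"
    print(f"group {group}: selection rate {rate:.0%}, "
          f"impact ratio {ratio:.2f} [{flag}]")
```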
5. Oversight and Accountability
Federal and state policymakers should develop new legal and technical standards, and equip state and federal regulators with the ability to meaningfully investigate and hold organizations accountable for ensuring equal opportunity in their use of hiring assessments.
- Applicants typically lack insight into hiring practices, including the design and impacts of hiring assessment technologies, and cannot effectively vindicate their rights alone.
Regulators should have the legal mandates, resources, and expertise they need to conduct proactive oversight and enforcement, including the ability to request information about how organizations develop and use hiring assessments. Government employers’ procurement and use of hiring assessment technologies should be subject to approval and transparency requirements. Effective oversight cannot rely on applicants knowing that they have been wronged and filing complaints.
- Today’s laws and regulatory guidance leave too much ambiguity for organizations, workers, and enforcement agencies trying to apply civil rights protections to modern hiring assessment technologies.
Policymakers should consider new regulations and guidance interpreting civil and human rights laws in light of predictive hiring tools, and should encourage the development of third-party auditing standards. At a minimum, agencies should conduct and publish research on hiring assessments and civil rights, including a candid reflection on federal and state governments’ capacity to oversee the design and use of modern hiring technologies.
Signatories:
The Leadership Conference Education Fund
The Leadership Conference on Civil and Human Rights
ACLU
AI Now Institute
American Association of University Women (AAUW)
Center for Democracy & Technology
Center for Law and Social Policy (CLASP)
Center on Privacy & Technology at Georgetown Law
Color of Change
Lambda Legal
Lawyers’ Committee for Civil Rights Under Law
NAACP
NAACP Legal Defense and Educational Fund, Inc.
National Association of Councils on Developmental Disabilities
National Center for Law and Economic Justice
National Employment Law Project
National Employment Lawyers Association (NELA)
National Organization for Women
National Partnership for Women & Families
National Women’s Law Center
New America’s Open Technology Institute
Open MIC (Open Media and Information Companies Initiative)
Public Knowledge
UnidosUS
Upturn
Workplace Fairness