AI + Tenant Screening


An overview of threats to civil rights from algorithmic tenant-screening systems

As the demand for housing continues to rise, landlords are increasingly taking advantage of artificial intelligence (AI) to review applications from prospective tenants. These AI tenant-screening systems are already being used to process rental applications; they’re running criminal background checks, assessing eviction history, and pulling credit scores.[i]

And just like other AI systems, these automated tenant screening systems can discriminate against people in a number of ways:

1. AI tenant-screening systems can’t understand context, weigh individuals’ circumstances, or account for systemic discrimination.

Using AI exacerbates an existing problem with tenant-screening policies: blanket policies, like wholly refusing to rent to applicants with an eviction or criminal record, ignore the complexities of real-world circumstances.

In the case of eviction records, it’s important to know that landlords can file to evict a tenant for any reason, lawful or not, and some landlords use eviction filings to unjustly threaten, punish, or exploit tenants.[ii] In other cases, tenants dealing with landlords refusing to address code violations are regularly advised to withhold rent until their landlord either fixes the problem or files to evict them (and gives the tenant an opportunity to convince a judge to compel the landlord to fix the problem).[iii] Even if the filing is dismissed by a court in this situation, the eviction filing can remain on court records and appear in tenants’ background checks for years.[iv]

In the case of criminal records, AI tenant-screening systems are regularly programmed to deny anyone with a criminal record, regardless of key factors like whether the individual was convicted, the severity of the crime, whether the crime was violent, when the offense occurred, or if the sentence has been completed.[v]

Given that marginalized communities are drastically overrepresented in both criminal and eviction court systems, these automated systems can worsen existing disparities.[vi] The Department of Housing and Urban Development (HUD) has acknowledged that these systems, if left unregulated, will discriminate against people of color and people with disabilities in ways that are illegal under disparate impact liability (a legal concept that gives people the grounds to sue if they can prove the impact of a decision or policy was discriminatory, even if they can’t prove the policy was intended to be discriminatory).[vii],[viii] Unfortunately, even though the Supreme Court has recognized the legality of disparate impact claims since the 1970s, the Trump administration’s Department of Justice has reversed course, rescinding its disparate impact regulations and ceasing all enforcement efforts.[ix],[x]

Consider a real-world example of disparate impact: Mikhail Arroyo, a young man with disabilities, was denied the opportunity to move into the same building as his mother, Carmen. As his conservator, she applied for him to live nearby so that she could be his caretaker while he maintained some independence. But the application was denied after CrimSafe, a tenant-screening system, flagged a criminal record — even though Arroyo had never been convicted of a crime. He had previously been charged with theft, but that charge had been dismissed.[xi],[xii]

2. AI tenant-screening systems can make confident mistakes — authoritatively presenting inaccurate, incomplete, or outdated data as legitimate.

Just like other AI systems, AI tenant-screening systems are prone to confident mistakes. Researchers have identified hundreds of federal lawsuits in which tenant-screening systems have incorrectly matched an applicant to someone else’s records, falsely assigning criminal histories, eviction cases, credit reports, and outstanding debts to people who have similar names.[xiii]

AI tenant-screening systems have also been known to miss important data. One company offering these services, SafeRent Solutions, settled a $2.275 million lawsuit in which plaintiffs alleged that the AI system failed to account for applicants’ housing vouchers, automatically denying them based on what they could pay without the voucher.[xiv] One of the 400 plaintiffs in this case, Mary Louis, was denied despite a recommendation from her previous landlord of 17 years, a housing voucher to help her pay, and her cosigner’s high credit score. SafeRent still scored her too low, and Louis had to move into a more expensive apartment.

Despite the unreliability of these systems, more than 90 percent of landlords require applicants to submit to tenant screening systems.[xv] Landlords often take the results of these screenings at face value, not understanding these systems’ fallibility.[xvi] And unfortunately for tenants, challenging the results of these background checks is difficult, when it is even possible.[xvii]

3. Current laws to protect prospective renters against discrimination, disparate impact, and errors made by tenant-screening systems weren’t made with AI in mind.

In 2023, the Office of Fair Housing and Equal Opportunity at HUD issued guidance on applying the Fair Housing Act to AI tenant-screening tools.[xviii] This guidance was issued in recognition that existing fair housing law was not developed with AI in mind, and that courts have struggled to apply civil rights laws such as the Fair Housing Act and Fair Credit Reporting Act to purportedly “neutral” AI tools that have nevertheless produced disparate outcomes. Companies that make AI tenant-screening systems have argued that they are not housing providers and are thus not subject to the Fair Housing Act, but housing advocates contend that, as agents of landlords and property management companies, these companies can also be found liable.[xix] While the 2023 guidance from HUD agrees with advocates, it is only guidance—not law—and is unlikely to be followed by the Trump administration, which has already issued a rule abandoning disparate impact liability.[xx]

Potential Solutions

To get to the root of the threats that AI tenant-screening systems pose to civil rights, it is essential that advocates push both companies and Congress to act in our best interest.

Influencing Company Behavior

In May 2025, The Leadership Conference’s Center for Civil Rights and Technology released the Innovation Framework, a guiding document for companies that invest in, create, and use AI.[xxi] Its aim is to ensure that AI systems protect and promote civil rights and are fair, trusted, and safe for everyone, especially communities historically pushed to the margins. The framework consists of four Foundational Values for guiding long-term business strategy and decisions, and ten specific Lifecycle Pillars aligned with the AI development and deployment pipeline to ensure those values are implemented in practice and that the technology truly works. If implemented by the companies that develop AI-powered screening systems and by the landlords and property managers who use them, the Innovation Framework can help protect prospective renters against wrongful denials. Even without a federal standard, companies can choose to act now to protect people against biased AI, especially when it comes to housing access.

Comprehensive AI Legislation and Regulation

In order to protect tenants from the harms of AI, any comprehensive federal AI legislative package must include specific and meaningful civil rights safeguards and privacy protections.

In addition to a comprehensive federal standard that is civil-rights protective, Congress should also pass the AI Civil Rights Act, a bill reintroduced in the 119th Congress by Senator Edward J. Markey and Congresswoman Yvette D. Clarke that would ensure that our civil rights are protected no matter how AI develops or in what sectors AI tools are used.[xxii] Specifically, the bill would mean that companies couldn’t create or use algorithmic tools that use our data to discriminate against us; it would require that AI tools be tested, transparent, and accountable; and it would allow individuals to sue when harmed by AI tools.

Finally, sector-specific legislation and regulation that empowers HUD and the Consumer Financial Protection Bureau to investigate and regulate algorithmic products could also be powerful tools to protect people from discrimination in the rental economy.


[i] Tech Equity Collaborative. “The Promise and Perils of Residential Proptech: Year 1 Research Summary Report.” Tech, Bias, and Housing Initiative. April 2023. https://techequity.us/wp-content/uploads/2023/04/TBHI-Y1-Research-Summary-Report.pdf

[ii] Pappoe, Yvette NA. “The Scarlet Letter “E”: How Tenancy Screening Policies Exacerbate Housing Inequity for Evicted Black Women.” Boston University Law Review. February 2023. Vol. 103(1), Pgs. 269-310. https://www.bu.edu/bulawreview/files/2023/04/PAPPOE.pdf

[iii] Leiwant, Matthew Harold. “Locked Out: How Algorithmic Tenant Screening Exacerbates the Eviction Crisis in the United States.” Georgetown Law Technology Review. 2022. Vol. 6, Pgs. 277-299. https://georgetownlawtechreview.org/locked-out-how-algorithmic-tenant-screening-exacerbates-the-eviction-crisis-in-the-united-states/GLTR-02-2022/

[iv] Eisenberg, Alexa and Brantley, Kate. “Record Costs: Collateral Consequences of Eviction Court Filings in Pennsylvania.” University of Michigan Housing Solutions for Health Equity. July 2024. https://www.urbanh3.com/_files/ugd/9d463d_6517025d2feb407f86473c5006da1484.pdf

[v] Karpinski, Lauren. “The Discriminatory Impacts of AI-Powered Tenant Screening Programs.” Georgetown Journal on Poverty Law and Policy. July 12, 2025. https://www.law.georgetown.edu/poverty-journal/blog/the-discriminatory-impacts-of-ai-powered-tenant-screening-programs/

[vi] Humber, Nadiyah J. “A Home for Digital Equity: Algorithmic Redlining and Property Technology.” California Law Review. October 2023. Vol. 111, Pgs. 1421-1484. https://www.californialawreview.org/print/a-home-for-digital-equity

[vii] “HUD Memo: Criminal Background Screenings May Violate Fair Housing Act.” National Low Income Housing Coalition. July 21, 2022. https://nlihc.org/resource/hud-memo-criminal-background-screenings-may-violate-fair-housing-act

[viii] Bains, Chiraag. “The legal doctrine that will be key to preventing AI discrimination.” Brookings. September 13, 2024. https://www.brookings.edu/articles/the-legal-doctrine-that-will-be-key-to-preventing-ai-discrimination/

[ix] Bains, Chiraag. “When Machines Discriminate: The Critical Role of Disparate Impact in AI Accountability.” The Leadership Conference’s Center for Civil Rights and Technology. January 22, 2026. https://civilrights.org/disparate-impact-ai/.

[x] US Department of Justice. “Rescinding Portions of Department of Justice Title VI Regulations To Conform More Closely With the Statutory Text and To Implement Executive Order 14281.” The Federal Register. December 10, 2025. https://www.federalregister.gov/documents/2025/12/10/2025-22448/rescinding-portions-of-department-of-justice-title-vi-regulations-to-conform-more-closely-with-the

[xi] Cohen Milsten. “Connecticut Fair Housing Center and Carmen Arroyo v. Corelogic Rental Property Solutions, LLC.” April 24, 2018. https://www.cohenmilstein.com/wp-content/uploads/2023/07/CoreLogic-Complaint-04242018_0.pdf

[xii] Lecher, Colin. “Automated background checks are deciding who’s fit for a home.” The Verge. February 1, 2019. https://www.theverge.com/2019/2/1/18205174/automation-background-check-criminal-records-corelogic

[xiii] Kirchner, Lauren and Goldstein, Matthew. “Access Denied: Faulty Automated Background Checks Freeze Out Renters.” The Markup and The New York Times. May 28, 2020. https://themarkup.org/locked-out/2020/05/28/access-denied-faulty-automated-background-checks-freeze-out-renters

[xiv] Cohen Milstein. “Louis, et al. v. SafeRent Solutions, et al.” November 24, 2024. https://www.cohenmilstein.com/case-study/louis-et-al-v-saferent-solutions-et-al/

[xv] TransUnion. “Landlord Survey: Optimism In Renting Your Property.” June 6, 2017. https://www.mysmartmove.com/blog/transunion-landlord-survey-summary

[xvi] So, Wonyoung. “Which Information Matters? Measuring Landlord Assessment of Tenant Screening Reports.” Housing Policy Debate. Vol. 33(6), Pgs. 1484–1510. https://www.tandfonline.com/doi/full/10.1080/10511482.2022.2113815

[xvii] Ibid. See Footnote 8.

[xviii] U.S. Department of Housing and Urban Development Office of Fair Housing and Equal Opportunity. “Guidance on Application of the Fair Housing Act to the Screening of Applicants for Rental Housing.” April 29, 2024. https://archives.hud.gov/news/2024/FHEO_Guidance_on_Screening_of_Applicants_for_Rental_Housing.pdf

[xix] Lawrence, Cheryl and Voz, Dominic. “Open Communities Reaches Resolution in Case Alleging AI Discrimination.” Open Communities. January 31, 2024. https://www.open-communities.org/post/press-release-open-communities-reaches-accord-in-case-addressing-artificial-intelligence-communicat

[xx] U.S. Department of Justice Office of Public Affairs. “Department of Justice Rule Restores Equal Protection for All in Civil Rights Enforcement.” December 9, 2025. https://www.justice.gov/opa/pr/department-justice-rule-restores-equal-protection-all-civil-rights-enforcement.

[xxi] The Center for Civil Rights and Technology at The Leadership Conference on Civil and Human Rights. “Innovation Framework: A Civil Rights Approach to AI.” https://innovationframework.org/

[xxii] “Sen. Markey, Rep. Clarke Reintroduce AI Civil Rights Act to Eliminate AI Discrimination and Enact Guardrails on Use of Algorithms in Decisions Impacting People’s Rights, Civil Liberties, Livelihoods.” December 1, 2025. https://www.markey.senate.gov/news/press-releases/sen-markey-rep-clarke-reintroduce-ai-civil-rights-act-to-eliminate-ai-discrimination-and-enact-guardrails-on-use-of-algorithms-in-decisions-impacting-peoples-rights-civil-liberties-livelihoods