
Tenant Screening Algorithms Enable Racial and Disability Discrimination at Scale, and Contribute to Broader Patterns of Injustice

Last fall, a federal district court judge agreed to hear a tenant’s case against a tenant screening company – the first of its kind. The plaintiff, Carmen Arroyo, had asked permission for her son Mikhail to move in with her after he became disabled in an accident and needed care. A tenant screening company denied the request after generating a report stating only that it had found a “criminal court action” relating to Mikhail, with no further detail. Carmen and Mikhail later learned that the company had treated records of his arrest and citation on a minor shoplifting accusation as a “disqualifying criminal record,” even though he was never convicted and the charge was ultimately withdrawn. Because Mikhail couldn’t move in with his mother, he was left in an institution for a year. In ongoing litigation, the company has claimed that it is not engaging in housing discrimination as Mikhail alleges, because it is not responsible for housing decisions – only landlords and property managers are.

This kind of automatic decision-making is not unique to Carmen and Mikhail. If you apply to rent an apartment today, an algorithm might decide whether or not you’ll be approved.

Landlords and property managers have not always used algorithmic tools to screen potential tenants. Traditionally, they’ve checked eviction records, criminal records, past landlord references, and sometimes credit scores – background checks that are already rife with potential for racial, class, and disability discrimination. Tenant screening is a subtype of credit reporting, subject to specific federal rules under the Fair Credit Reporting Act (FCRA) that apply both to landlords and to tenant screening companies. Although FCRA is not an antidiscrimination law, compliance with it helps reduce discrimination: its protections cover eviction records, credit reports, and criminal background checks, with additional requirements for investigative reports that involve personal interviews. FCRA requires landlords or tenant screening companies to inform prospective tenants of any adverse decision made based on such reporting, and to provide a copy of the report on request so the tenant can dispute inaccurate information. FCRA also requires a high degree of accuracy in such reports – for instance, if a person reports errors, the reporting agency must reinvestigate the information and correct any errors.

Despite the existence of FCRA protections in theory, tenants still face substantial risks in practice. Today, with the aid of new algorithmic tools, landlords can send prospective tenants’ applications to automated systems that compare their data against millions of records – some incomplete, unreliable, or easily confused – with little to no opportunity for recourse, even if the law prohibits discrimination.
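
To make concrete how such mismatches can arise, here is a minimal, hypothetical sketch of the kind of loose record matching a screening system might perform. The matching rule, field names, and records below are invented for illustration and are not drawn from any actual vendor’s system:

```python
from difflib import SequenceMatcher

# Hypothetical records. Real screening databases aggregate millions of court,
# eviction, and credit entries of widely varying quality and completeness.
court_records = [
    {"name": "Jordan Smith", "dob": "1990-03-12",
     "type": "arrest", "disposition": "charge withdrawn"},
]

applicant = {"name": "Jordon Smith", "dob": "1990-03-12"}

def loose_match(name_a: str, name_b: str, threshold: float = 0.8) -> bool:
    """Treat two names as 'the same person' if they are merely similar.

    A rule this permissive will conflate relatives, common names, and variant
    spellings -- one way an applicant can inherit someone else's record, or
    have an arrest with no conviction surface as a disqualifying flag.
    """
    return SequenceMatcher(None, name_a.lower(), name_b.lower()).ratio() >= threshold

# Flag any court record that loosely matches the applicant's name and birthdate.
flags = [
    rec for rec in court_records
    if loose_match(rec["name"], applicant["name"]) and rec["dob"] == applicant["dob"]
]

for rec in flags:
    # Note what the flag drops: the disposition showing there was no conviction.
    print(f"FLAG: criminal court action ({rec['type']}, {rec['disposition']})")
```

An applicant who never sees the underlying record has little chance to catch or dispute a match like this before being denied.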

For people who need a place to live, the consequences of mismatched eviction, criminal, and default records can be severe. Denials based on credit score, eviction history, or criminal records could run afoul of the law in several ways:

  • Denials based on criminal records can unfairly shut people out of housing, violate the law, and disparately impact people of color and disabled people. Guidance from the U.S. Department of Housing and Urban Development (HUD) on the Fair Housing Act specifies that landlords cannot deny housing based solely on arrest records (rather than conviction records) or impose blanket bans on anyone with any criminal record. When a person does have a conviction on their record, a landlord must consider the actual facts surrounding the record on a case-by-case basis. People of color and people with disabilities are more likely to be targeted for mass criminalization, and therefore more likely to be affected by an algorithmic system that flags applicants with any criminal history (see the sketch after this list). Additionally, incarceration – even pretrial detention without any conviction – can jeopardize someone’s ability to pay rent and lead to eviction or broken leases, both of which can become proxies for criminal history in an algorithmic system.
  • Denials based on recent evictions and poor rental payment history are legal, but can disproportionately deny housing to disabled people and people of color. FCRA permits tenant screening companies to report eviction and nonpayment records from the last seven years, but screening companies and landlords sometimes include older information, which can result in unfairly denied applications. Disabled people and people of color experience poverty and unemployment at higher rates than nondisabled or white people. People with unstable or limited income are also less likely to be able to make consistent rental payments, and have a harder time securing housing with landlords who are skeptical of rental subsidy programs. This can mean that disabled people and people of color are less likely to have a stable rental history or proof of sufficient funds or income.
  • Denials based on characteristics or experiences associated with survivors of domestic violence can also violate the law. People of color and disabled people are more likely to be victimized by family violence or intimate partner abuse. People who are subjected to domestic violence may have trouble leaving an abusive situation because they lack control over or access to finances, and have difficulty building up independent credit. These issues are compounded for disabled people, who might rely on an abuser for access to housing or health care. As a result, algorithmic systems that report more favorably on people with higher credit scores or longer rental histories could inadvertently penalize disabled victims and survivors of domestic violence.

    Additionally, HUD guidance stipulates that a landlord’s or property management company’s use of nuisance laws to evict victims and survivors of domestic violence can amount to illegal discrimination based on sex or familial status. Such laws can penalize people for making frequent emergency calls, which discourages victims and survivors from seeking assistance and punishes those who do.
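
To make the criminal-records point above concrete, the following minimal sketch contrasts a blanket “any record disqualifies” rule with the kind of individualized, conviction-focused review HUD’s guidance calls for. The functions, fields, and thresholds are hypothetical placeholders, not a statement of the legal standard or of any vendor’s actual logic:

```python
from dataclasses import dataclass

@dataclass
class Record:
    kind: str          # "arrest" or "conviction"
    disposition: str   # e.g. "withdrawn", "guilty"
    years_ago: int
    offense: str

# The blanket rule described above: any record at all disqualifies the applicant.
def blanket_flag(records: list[Record]) -> bool:
    return len(records) > 0

# Closer to the case-by-case review HUD guidance requires: ignore mere arrests
# and weigh the nature and recency of actual convictions. The specific factors
# here are illustrative placeholders only.
def individualized_review(records: list[Record]) -> bool:
    convictions = [r for r in records if r.kind == "conviction"]
    return any(r.years_ago < 7 and r.offense == "violent felony" for r in convictions)

applicant_history = [Record(kind="arrest", disposition="withdrawn",
                            years_ago=1, offense="shoplifting")]
print(blanket_flag(applicant_history))           # True  -> denied despite no conviction
print(individualized_review(applicant_history))  # False -> an arrest alone does not disqualify
```

Because arrests and convictions fall disproportionately on people of color and disabled people, the blanket rule also imports those disparities directly into the screening decision.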

Because of these disparities, information about past arrests, evictions, or defaulted loans can serve as a proxy for race and disability. Our community members are more likely to experience disruptions to income and housing stability, which can directly affect their eligibility on future rental applications.

These types of surveillance technologies are not limited to tenant screening. Landlords and property management companies also deploy surveillance technologies to control what tenants do once they are on the property, automate eviction proceedings, and suppress rent strikes and tenant organizing. Even with the eviction moratorium currently in place, expanded use of these technologies can deepen economic and housing injustices that already disproportionately impact disabled people, poor people, and people of color.

Policymakers must carefully limit and regulate the use of housing algorithms because of their significant potential to harm marginalized communities. Regulators must ban the use of automated decision-making systems in rental applications, and require landlords and tenant screening companies to make and provide individualized decisions when considering a prospective tenant’s application. FCRA already requires specific disclosures; landlords and tenant screening companies must also provide explicit notice of any use of computerized or algorithmic systems, and may not use such systems unless those systems provide only the information FCRA permits to be disclosed. These protections would help ensure that tenants do not face adverse decisions based on information landlords shouldn’t consider, and that tenants have better access to housing as a result.