New Housing Regulation Limits Disparate Impact Housing Claims Based on Algorithms

Oct. 20, 2020
Sept. 29, 2020
Individuals alleging that a landlord discriminated against them by using a tenant-screening algorithm will face a higher burden of proof under a new rule that went into effect last Thursday. The rule creates a defense to a discrimination claim under the Fair Housing Act where the “predictive analysis” tools used were not “overly restrictive on a protected class” or where they “accurately assessed risk.” Last October, EPIC and several others warned the Department of Housing and Urban Development that providing such a safe harbor for the use of algorithms in housing, without imposing transparency, accountability, or data protection requirements, would exacerbate harms to individuals subject to discrimination. The agency did modify the rule in response to comments from EPIC and others, removing a complete defense based on the use of an “industry standard” algorithm or on a showing that the algorithm was not the “actual cause” of the disparate impact. But the final rule simply replaces the word “algorithm” with “predictive analysis” and retains the vague “overly restrictive” and “accurate assessment” standards. The Alliance for Housing Justice called the rule “a vague, ambiguous exemption for predictive models that appears to confuse the concepts of disparate impact and intentional discrimination.”

The content in this post was found at:


Clicking the title or link will take you to the source of the post, which was not authored by the moderators of this site.