It appears that HUD, as part of the general initiative to stop enforcing the housing laws it’s supposed to enforce, is poised to allow landlords to hide behind computer algorithms as they discriminate against minority tenants. As Andrew Selbst – who co-authored one of the foundational pieces on exactly this sort of problem – describes the proposed rule change in Slate:
“The proposal, while billed as a mere update to bring regulations in line with the Supreme Court’s 2015 decision, creates entirely new rules for landlords using algorithms. There are functionally two separate defenses being created. First, the proposal allows landlords to use an algorithm as long as its inputs are not “substitutes or close proxies” for protected characteristics and as long as it’s predictive of what it purports to predict—or a “neutral third party” certifies that fact. So if a hypothetical landlord decides to predict number of noise complaints as a proxy for difficult tenants, using music streaming data they somehow obtained, they might find a correlation between preferred musical genre and how difficult a tenant is. Of course, musical preference is not a substitute or close proxy for race, but an algorithm that equates a preference for hip-hop with noise complaints is probably picking up on race as a factor in frequency of noise complaints. Unlike under existing law, under this rule, a landlord would be off the hook even where there may be less discriminatory alternatives. Second, the landlord is also immunized from any discrimination claim if he uses a tool developed and maintained by a recognized third party. These safe harbors supposedly ensure that a model is legitimate and the landlord has not himself “caused” the discriminatory outcome.”
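To make the mechanism in Selbst’s example concrete, here is a minimal Python sketch. Everything in it is invented for illustration (the group labels, the genre split, and the complaint rates are assumptions, not data), but it shows how a score built only from musical preference can end up sorting applicants by race when both the preference and the training signal (noise complaints) correlate with race.

```python
# Toy simulation, all numbers invented: a tenant-screening score that never
# sees race can still sort applicants by race when it keys on a correlated
# feature (preferred genre) and a biased outcome (noise complaints).
import random
from collections import defaultdict

random.seed(1)

def make_tenant(race):
    # Assumed correlations, for illustration only.
    genre = "hip-hop" if random.random() < (0.7 if race == "black" else 0.2) else "other"
    # Complaints here are driven by race, not by the music itself -- this is
    # the bias the genre-based score ends up absorbing.
    complained = random.random() < (0.5 if race == "black" else 0.1)
    return {"race": race, "genre": genre, "complained": complained}

tenants = [make_tenant("black") for _ in range(1000)] + \
          [make_tenant("white") for _ in range(1000)]

# "Train" the score: estimate complaint rate per genre (race is never an input).
totals, hits = defaultdict(int), defaultdict(int)
for t in tenants:
    totals[t["genre"]] += 1
    hits[t["genre"]] += t["complained"]
risk = {g: hits[g] / totals[g] for g in totals}
print("Estimated complaint risk by genre:", risk)

# Apply the genre-only score and see who gets screened out, by race.
flagged = [t for t in tenants if risk[t["genre"]] > 0.3]
for race in ("black", "white"):
    share = sum(t["race"] == race for t in flagged) / len(flagged)
    print(f"Share of flagged applicants who are {race}: {share:.2f}")
```

The score never looks at race, yet the applicants it flags come overwhelmingly from one group. That is exactly the dynamic Selbst worries the proposed safe harbors would immunize.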
A brief discrimination backgrounder: legally, discrimination can be established through disparate treatment or disparate impact. Disparate treatment is what it sounds like: I discriminate against black tenants if I decide not to rent to any of them, because they are black. As you can imagine, this sort of discrimination is hard to prove, because not many people are dumb enough to advertise it. The more standard strategy is to mask discrimination behind proxies that track race (or some other protected category) without actually naming it. Given residential segregation, for example, zip code is a pretty decent proxy for race, and so a landlord might systematically disfavor applicants whose previous address is in a specific zip code. That’s also not ok, because although the policy doesn’t (directly) “intend” to discriminate, its impact is disparate between racial groups, and so it’s treated the same as if it did.
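To put a rough number on “disparate,” here is a similar sketch for the zip-code example. The zip codes, group sizes, and the 80% “four-fifths” threshold (a rule of thumb borrowed from employment-discrimination practice, not anything specific to the Fair Housing Act) are all illustrative assumptions.

```python
# Illustrative sketch, all numbers invented: a facially neutral screening rule
# keyed on zip code yields starkly different acceptance rates by group when
# housing is segregated, which is what a disparate impact claim measures.
import random

random.seed(0)

def make_applicant(group):
    # Residential segregation (assumed for illustration): group B applicants
    # are concentrated in one zip code, group A applicants mostly are not.
    in_disfavored_zip = random.random() < (0.8 if group == "B" else 0.1)
    return {"group": group, "zip": "60637" if in_disfavored_zip else "60614"}

applicants = [make_applicant("A") for _ in range(500)] + \
             [make_applicant("B") for _ in range(500)]

# The landlord's rule never mentions race: reject anyone from zip 60637.
def accepted(applicant):
    return applicant["zip"] != "60637"

def acceptance_rate(group):
    pool = [a for a in applicants if a["group"] == group]
    return sum(accepted(a) for a in pool) / len(pool)

rate_a, rate_b = acceptance_rate("A"), acceptance_rate("B")
print(f"Group A acceptance rate: {rate_a:.2f}")   # roughly 0.90
print(f"Group B acceptance rate: {rate_b:.2f}")   # roughly 0.20
# One rough screen (the employment-law "four-fifths" rule of thumb): an
# impact ratio well below 0.8 suggests the policy's impact is disparate.
print(f"Impact ratio (B/A): {rate_b / rate_a:.2f}")
```

A gap like this is what a disparate impact claim is built on; as the Selbst quote above notes, the proposed rule would let a landlord off the hook even where less discriminatory alternatives may exist.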