By Gordon Hull
I’ve written about the importance of Illinois’ Biometric Information Privacy Act (BIPA) before (see also here). Briefly, BIPA is the most important and powerful of the (relatively few) state laws designed to protect biometric privacy. The statute establishes a notice-and-consent regime (sigh; better than nothing, though N&C doesn’t work well and is disturbing as a norm) for private parties that collect biometric information like face scans, establishes the need for data retention policies, establishes a private right of action (individuals can sue; other states make you go through the state attorney general), and establishes a statutory harm: as underlined by the Illinois Supreme Court, violating the statute is enough to collect damages.
Companies like Facebook have been fighting BIPA hard, because it’s bad for their business model, and some of the main litigation has been around Facebook’s photo-tagging feature. A lot of the issue has been about standing – whether aggrieved parties have the right to sue. Standing sounds simple: in order to have standing under Article III of the Constitution, three conditions need to be met. Going through them, however, will indicate why this is harder than it looks.
- First, the plaintiff must have suffered an “injury in fact,” an invasion of a legally protected interest which is (a) concrete and particularized, and (b) actual or imminent, not “conjectural” or “hypothetical.”
- Second, there must be a causal connection between the injury and the conduct complained of, the injury has to be fairly traceable to the challenged action of the defendant, and not the result of the independent action of some third party not before the court.
- Third, it must be "likely," as opposed to merely "speculative," that the injury will be "redressed by a favorable decision."
To make matters more complicated, the Supreme Court has been throwing up barriers to standing in data cases. The most immediately relevant is Spokeo v. Robins (2016). In that case, Robins argued that Spokeo’s profile inaccurately (and thus actionably under the Fair Credit Reporting Act) “state[d] that he is married, has children, is in his 50’s, has a job, is relatively affluent, and holds a graduate degree.” Justice Alito’s opinion emphasizes that standing requires demonstrating an injury that is both “concrete and particularized” (that’s the first of the three) and sends the case back to the 9th Circuit on the grounds that it only considered particularity but not concreteness. That is, Robins had demonstrated that the inaccurate information was specific to him, but the 9th Circuit had not considered whether he demonstrated more than a non-actionable, “bare procedural” violation of the law. As an example of the latter, Alito suggests an inaccurately reported zip code, because “it is difficult to imagine how the dissemination of an incorrect zip code, without more, could work any concrete harm” (p. 1550).
The 9th Circuit did eventually find that Robins met the standard, but Alito’s opinion is illustrative of a problematic skepticism about standing in data breach cases. Dan Solove and Danielle Citron have developed a detailed theory of why data breach harms should be actionable; not only are fear and anxiety real things, they are actionable in other contexts. Even in the limited context of Spokeo, Justice Ginsburg’s dissent points to problems: inaccurate credit information is more than a zip code (the 9th Circuit picked up on precisely this point: “given the ubiquity and importance of consumer reports in modern life—in employment decisions, in loan applications, in home purchases, and much more—the real-world implications of material inaccuracies in those reports seem patent on their face” (p. 1114)). So too, it is not hard to imagine how an inaccurately reported zip code might cause concrete harm, because it could negatively affect one’s credit score and thus job prospects.
Anyway, back to BIPA. A lot of the litigation around BIPA has revolved around the first condition for standing: whether the harms caused by its violation are actual or imminent, concrete, etc. After all, the harms of a data breach are both prospective and intangible, and potentially untraceable. This is why BIPA established a statutory harm, but it remained to be litigated whether the statutory harm was merely procedural. After several failed attempts to argue that plaintiffs lacked standing under Spokeo, Facebook finally settled for over $500 million in January 2020.
So much, then, for the state of litigation around BIPA Section 15(b), the informed consent provision. But what about Section 15(a), which requires a private entity in possession of biometric data “to develop, publicly disclose, and implement a retention schedule and guidelines for destroying the data when the initial purpose for collection ends?” In November, the 7th Circuit added a few more teeth to the law in Fox v. Dakkota. Raven Fox worked for Dakkota, which had a handprint screening system for checking into work. After she left, she alleged a BIPA violation because Dakkota had no data retention policy. What, in other words, were they going to do with her handprints?
Dakkota relies primarily on two prior cases. Miller v. Southwest Airlines, although ultimately ruling that BIPA was pre-empted by federal labor law (it involved unionized airline employees), established that the use of fingerprinting for employee identification was a material change in employment conditions, and thus sufficiently concrete to confer standing. The more difficult precedent was Bryant v. Compass Group:
“Christine Bryant sued Compass Group USA, Inc., the owner and operator of "Smart Market" vending machines located in her workplace cafeteria. The vending machines did not accept cash. Instead, customers established a user account by scanning their fingerprints and setting up a payment link; they could then make purchases using a fingerprint scanner on the machines. Bryant voluntarily set up an account and regularly made purchases from the vending machines. She filed a proposed class action in state court accusing Compass Group of two BIPA violations: it "never made publicly available" a data-retention schedule and data-destruction guidelines, violating section 15(a), and it never obtained her informed consent in writing, violating section 15(b).” (Fox v. Dakkota, 15-16, internal citations omitted).
The Bryant court found standing on the informed consent claim, but not on data retention. On the latter point, emphasizing the narrowness of its ruling, the Court concluded:
“We explained that "the duty to disclose [data-retention policies] under section 15(a) is owed to the public generally, not to particular persons whose biometric information the entity collects." And because Bryant "allege[d] no particularized harm that resulted from Compass [Group's] violation of section 15(a)," we concluded that she lacked Article III standing to pursue that claim in federal court.” (Fox v. Dakkota, 17).
The Fox ruling is therefore significant, because it holds that the failure to have a data retention policy at all is sufficient to confer standing under BIPA.
The Fox Court ruled:
“Fox … does not allege a mere failure to publicly disclose a data-retention policy. She accuses Dakkota of violating the full range of its section 15(a) duties by failing to develop, publicly disclose, and comply with a data-retention schedule and guidelines for the permanent destruction of biometric data when the initial purpose for collection ends. That violation, she alleges, resulted in the unlawful retention of her handprint after she left the company and the unlawful sharing of her biometric data with the third-party database administrator.”
In other words, failing to develop and comply with a retention policy at all is far worse than having one and merely failing to disclose it. The Court continues, drawing an analogy between the retention and the collection of data:
“An unlawful retention of biometric data inflicts a privacy injury in the same sense that an unlawful collection does. Just as section 15(b) expressly conditions lawful collection of biometric data on informed consent, section 15(a) expressly conditions lawful retention of biometric data on the continuation of the initial purpose for which the data was collected. The BIPA requirement to implement data retention and destruction protocols protects a person's biometric privacy just as concretely as the statute's informed-consent regime. It follows that an unlawful retention of a person's biometric data is as concrete and particularized an injury as an unlawful collection of a person's biometric data. If the latter qualifies as an invasion of a "private domain, much like an act of trespass would be," then so does the former” (18-19, internal citations omitted).
This strikes me as important for two reasons. First, it continues the welcome trend of federal courts awarding standing in BIPA cases. The more defendants that are forced to answer BIPA cases, rather than argue that a violation of BIPA isn’t actually an injury, the more seriously they will take it. That’s good for everyone.
Second, it makes good sense as privacy theory. If we analogize privacy to trespass, as the court does here, then it makes no conceptual sense to say that somebody’s failure to leave my property is somehow a lesser offense than their initial trespass. Their failure to leave is the continuation of that initial offense. Indeed, the failure to leave is arguably the worse offense, as reflected in the fact that it’s very hard to prosecute trespass if the offender leaves the property when told to. If the injury their trespass occasions is that somebody might see them there (I know I’m straining, but this is sort of like data breach in this context), then the longer they stay the greater the risk of harm. Beyond the trespass analogy, recall that the risks associated with biometric privacy violation – risks that the BIPA legislature explicitly considered – are in part because of the difficulty in remedying them. If your credit card number is stolen, it’s easy enough to cancel and issue a new number. If your handprint is stolen, it’s very difficult to get a new handprint! In other words, the risks of biometric privacy invasion are quite serious.
It works for other privacy theories, too. Helen Nissenbaum’s work on privacy as contextual integrity views information privacy through the lens of norms of information flow. One of the norms governing interpersonal information is that you forget it after a while. I see lots of people’s faces, but there are relatively few that I can reliably associate with names. This norm is obviously under some strain given the ability of computers to retain information longer than people (it’s hardly a new issue, in other words), but it’s reasonable to think that part of my giving sensitive information to someone – they don’t just see my face, but get something analogous to a handprint or a retinal scan – is a sense that they won’t retain it longer than necessary. And if there is a sense that they will remember it forever, I am going to be very, very selective about whom I disclose it to. In other words, the widespread use of biometric data requires some sort of retention/deletion policy with teeth to come within striking distance of norms about information flow.
On the other hand, if one views privacy as about establishing social trust, as Ari Ezra Waldman does, then it seems clear that we are unlikely to build trust in biometric identification practices unless we trust that the data will be handled and disposed of correctly. This is an issue: recent survey data shows that people do not trust biometric data collection. When surveyed about their general comfort level with biometric information collection by corporate entities, a strong majority (74.6%) of respondents reported being at least somewhat uncomfortable. People’s reasons included that it felt invasive for a company to collect and share the information (70%), concern about where this information might lead in the future (67%), concern about public tracking (63%), that it increases the risk of identity theft (53%), that it could be used to find out other things about them (41%), that the information was part of them (31%), and that it cannot be changed (22%).
Finally, if privacy is somehow about power in the Foucauldian sense, then establishing norms according to which corporate entities are accountable for their data practices is a minimal, first baby-step in pushing back against corporate data capitalism.
All in all, a positive step forward.