Let’s say the state passes a law that says restaurants may not put worms in hamburgers, and that customers can sue those that do. Your kids eat at the local Annelids franchise on the way home from school, and you later discover that the burgers contain worms. You sue the restaurant, and in response it says that since your kids didn’t get sick from the experience, it only “technically” violated the law, and you therefore have no basis either to collect damages or to force it to stop. That is, on its face, a pretty stupid construction of the law, since it would make the law totally unenforceable. It is also, more or less, how Facebook, Google, and Six Flags Great America think you should read the Illinois Biometric Information Privacy Act (BIPA).
In a unanimous ruling yesterday, the Illinois Supreme Court said otherwise. The Act regulates companies that collect biometric information like fingerprints and face scans. In the case at hand, the amusement park used thumbprints to identify customers who had bought a season pass, in order to admit them quickly and to stop people from sharing passes (you know the second reason is the real one; since when do amusement parks worry about long lines?). If a company wants to collect such information, it has to inform customers in writing of what it is doing and get affirmative consent. In particular, the company must “inform[] the subject or the subject’s legally authorized representative in writing of the specific purpose and length of term for which a biometric identifier or biometric information is being collected, stored, and used” (cited in para 20).
This is notice and consent privacy as it should be! There are a lot of reasons why I don’t think notice and consent privacy actually works – I called it a “successful failure” because its main effect is to teach us that privacy is something to be sold off, without actually protecting it. Nobody knows what they’re consenting to, the privacy notices are long and incomprehensible, and so on. Six Flags et al. don’t even get that far: they argue that they don’t actually have to follow the law. So for them, privacy should just be a failure.
BIPA stipulates that an aggrieved party can collect statutory damages and/or injunctive relief. Alexander Rosenbach’s mother signed him up for a season pass to Six Flags online, in anticipation of a school field trip. Once he got there, Alexander was shuttled to a kiosk where the park completed the process, which included thumbprinting him. The park did not provide the required documentation at any point in the process. His mother sued under the statute; Six Flags responded that although it violated the statute w/r/t her son, he is nonetheless not “aggrieved” under the law. Per the court:
“Rosenbach seeks redress on her son’s behalf and on behalf of a class of similarly situated individuals based solely on defendants’ failure to comply with the statute’s requirements. In defendants’ view, that is not sufficient. They contend that an individual must have sustained some actual injury or harm, apart from the statutory violation itself, in order to sue under the Act. According to defendants, violation of the statute, without more, is not actionable” (para 22).
We should pause to note the staggering hubris of the claim. It is not that there was a long privacy notice that Rosenbach failed to read; no, the claim is that the law is effectively unenforceable. The Illinois Supreme Court demolishes this nonsense as a matter of statutory interpretation:
“The Act vests in individuals and customers the right to control their biometric information by requiring notice before collection and giving them the power to say no by withholding consent. These procedural protections “are particularly crucial in our digital world because technology now permits the wholesale collection and storage of an individual’s unique biometric identifiers - identifiers that cannot be changed if compromised or misused.” When a private entity fails to adhere to the statutory procedures, as defendants are alleged to have done here, “the right of the individual to maintain [his or] her biometric privacy vanishes into thin air. The precise harm the Illinois legislature sought to prevent is then realized.” This is no mere “technicality.” The injury is real and significant. …. The situation is particularly concerning, in the legislature’s judgment, because “[t]he full ramifications of biometric technology are not fully known.”
“The strategy adopted by the General Assembly through enactment of the Act is to try to head off such problems before they occur. It does this in two ways. The first is by imposing safeguards to insure that individuals’ and customers’ privacy rights in their biometric identifiers and biometric information are properly honored and protected to begin with, before they are or can be compromised. The second is by subjecting private entities who fail to follow the statute’s requirements to substantial potential liability, including liquidated damages, injunctions, attorney fees, and litigation expenses “for each violation” of the law, whether or not actual damages, beyond violation of the law’s provisions, can be shown.
“The second of these two aspects of the law is as integral to implementation of the legislature’s objectives as the first. Other than the private right of action authorized in section 20 of the Act, no other enforcement mechanism is available. It is clear that the legislature intended for this provision to have substantial force. When private entities face liability for failure to comply with the law’s requirements without requiring affected individuals or customers to show some injury beyond violation of their statutory rights, those entities have the strongest possible incentive to conform to the law and prevent problems before they occur and cannot be undone. Compliance should not be difficult; whatever expenses a business might incur to meet the law’s requirements are likely to be insignificant compared to the substantial and irreversible harm that could result if biometric identifiers and information are not properly safeguarded; and the public welfare, security, and safety will be advanced. That is the point of the law. To require individuals to wait until they have sustained some compensable injury beyond violation of their statutory rights before they may seek recourse, as defendants urge, would be completely antithetical to the Act’s preventative and deterrent purposes” (para 34-7).
Six Flags is not the only company in potential trouble over BIPA; Facebook is involved in parallel litigation in federal court over facial recognition, and is making the same argument. Fortune reports that the total damages (either $1000 or $5000 per violation, depending on the nature of the violation) could run to billions of dollars, and cites an industry-friendly attorney who proposes that the statutory damages are “so ruinous and out of proportion with any technical violation as to be unconstitutional.”
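To see how you get from per-violation figures to “billions,” here is a back-of-the-envelope sketch. Only the $1,000 (negligent) and $5,000 (intentional or reckless) per-violation amounts come from the statute; the class size below is a made-up illustration, not a figure from the litigation.

```python
# Rough BIPA exposure arithmetic. The damages tiers are the statute's;
# the class size is a hypothetical placeholder for illustration only.
NEGLIGENT = 1_000   # statutory damages per negligent violation, in dollars
RECKLESS = 5_000    # per intentional or reckless violation

def exposure(class_size: int, per_violation: int) -> int:
    """Total statutory damages if each class member shows one violation."""
    return class_size * per_violation

hypothetical_class = 6_000_000  # made-up class size
low = exposure(hypothetical_class, NEGLIGENT)
high = exposure(hypothetical_class, RECKLESS)
print(f"${low:,} to ${high:,}")  # → $6,000,000,000 to $30,000,000,000
```

The point of the sketch is just that even a single violation per class member, at the statute’s lower tier, reaches the billions once the class is of social-media scale.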
Well, no. First, the idea that even a few billion dollars in fines would ruin Facebook or Google is absurd. Indeed, the companies’ incessant failure to follow privacy rules, and the experience of Europe in trying to enforce them with even large fines, suggests that only a penalty of that size would suffice to deter. Second, that argument studiously ignores that plaintiffs under BIPA are entitled to injunctive relief: they can force the company to stop the data collection. All this alarm about penalties sounds like a smokescreen to me: they don’t want anybody to be able to make them stop.
In any case, Facebook etc. are determined to fight BIPA tooth and nail. The Illinois case probably settles the “merely technical” violation question (at least as a matter of state law; as the Fortune story notes, federal law is a mess on this, and the Supreme Court has been trying to make class certification harder). FB is meanwhile contesting whether its technology is a “scan … of face geometry” under the statute. Plaintiffs say that “the technology necessarily collects scans of face geometry because it uses human facial regions to process, characterize, and ultimately recognize face images.” FB replies with a big-data two-step: the algorithm “learns for itself what distinguishes different faces and then improves itself based on its successes and failures, using unknown criteria that have yielded successful outputs in the past” (3). We don’t know whether the algorithm scans face geometry: it’s a black box! (it’s not conscious; I said it’s a zombie, you doof!) The federal court said this was a dispute of fact for a jury to handle.
There’s a lot to unpack here, but I’ll just conclude for now with the obvious point that this is going to be a long fight, and a whole lot depends on how it turns out. BIPA is a big deal, and it shows that those of us who have been worrying about the failures of notice and consent were apparently too optimistic.