by Gordon Hull
Judge Richard Posner’s well-known application of law and economics to privacy yields results that appear, well, ideological. First, he considers what individuals do with informational privacy. What is an interest in privacy of information, he asks? Well, it’s an interest in enforcing an information asymmetry in markets. Information asymmetry is presumptively bad because it causes distortion in the price mechanism; the price mechanism is in turn the reason that markets can claim to be both epistemically and normatively justified. They are epistemically justified because market price signals the social value of something much better than any sort of centralized planning process would do, and it does so without introducing all the inefficiencies of an enormous state apparatus. The price mechanism is normatively justified because it presents no special intrusion into the lives of individuals: we are all free to do what we want and signal (with our willingness to pay) what is important to us. In the case of privacy, for example, if I present myself or some good I am selling to you, “privacy” basically means that I’m trying to withhold relevant information about that good from you. If I apply for a job and hide a criminal record, then I’m trying to get you to overvalue me as a potential employee by keeping you ignorant of my past. Accordingly, the law should not protect such refusals to disclose, and in some cases ought to compel disclosure. Thus the first part of Posner’s article.
The trouble comes in the second part, where he then asserts that corporations ought to enjoy a greater right to privacy, because things like trade secrets and intellectual property encourage innovation. In his words:
“The law should in general accord private business information greater protection than it accords personal information. Secrecy is an important method of appropriating social benefits to the entrepreneur who creates them while in private life it is more likely to conceal discreditable facts” (404).
But now we have a couple of fairly obvious problems. First, where’s the comparison? Assuming that individuals use privacy to conceal discrediting information, we need to know whether the loss associated with this hiding is greater than whatever gain the privacy provides. Even from Posner’s paper, it’s not clear that’s the case, but he also stacks the deck by treating individuals and corporations differently (more about that in a minute). Conversely, do legal protections for corporate privacy like trade secrets encourage innovation? And, if they do, at what cost? Because IP involves monopoly pricing, some deadweight loss (people who want the product but not enough to pay the monopoly price, which means they don’t get the product and the producer doesn’t get a sale) is inevitable, though economic theories of IP tend to downplay this question.
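The deadweight-loss point can be made concrete with a toy model (the numbers and the linear-demand setup are my own illustration, not anything in Posner's paper): under linear demand with constant marginal cost, a monopolist restricts output below the competitive level, and the buyers who value the good above its cost but below the monopoly price simply go unserved.

```python
# Toy illustration of monopoly deadweight loss under linear demand.
# All parameter values are hypothetical, chosen only to show the mechanism.

def monopoly_outcome(a, b, c):
    """Demand P = a - b*Q, constant marginal cost c."""
    q_comp = (a - c) / b        # competitive output: price driven down to cost
    q_mono = (a - c) / (2 * b)  # monopoly output: marginal revenue = marginal cost
    p_mono = a - b * q_mono     # monopoly price read off the demand curve
    # Deadweight loss: the triangle of buyers who value the good above
    # cost but below the monopoly price, and so get nothing at all.
    dwl = 0.5 * (p_mono - c) * (q_comp - q_mono)
    return p_mono, q_mono, dwl

p, q, dwl = monopoly_outcome(a=100, b=1, c=20)
print(p, q, dwl)  # price 60.0, output 40.0, deadweight loss 800.0
```

The triangle is pure loss: the excluded buyers get no product, and the producer gets no sale from them. This is the cost side that, as noted above, economic theories of IP tend to downplay.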
Whether or not intellectual property spurs innovation is a very, very difficult empirical question, and in this case it’s made more complex because trade secrets need to be compared with patents as drivers of innovation, and patents are often defended at least partly on the grounds that they require those seeking them to publicize their technique in return for the exclusive right to market it. There’s an emerging literature that strongly suggests that, at least in some contexts, innovation happens either just fine or even better without IP protection (I am relying on the patents literature here, because I know it better than the trade secrets literature, but the points I’m making should generalize, since they turn on the effects of information asymmetries rather than on the particular legal mechanism that creates them). The founding analysis is provided in this paper about the fashion industry, but there’s enough sector-specific work that Elizabeth Rosenblatt has attempted a theoretical account.
However these empirical questions turn out, there’s a more troubling asymmetry: why is it that individuals would only use privacy to hide information but not to innovate? After all, as Foucault notes, it is precisely theorists like Posner’s sometimes co-author Gary Becker who insist that individuals need to be understood as “entrepreneurs of themselves.” Not only that, there is now a literature that indirectly answers Posner on this point. For example, Julie Cohen argues that privacy is necessary for the sort of free play that enables innovation. Even if she is wrong about this point – and it’s controversial – the larger point that privacy is necessary for the development of the sort of subjectivity we value is made in a variety of places, and isn’t particularly novel (see, e.g., Jeffrey Reiman here, way back in 1995). Similarly, it’s not hard to find evidence that corporations use information privacy-based arguments to hide defects in their products; tobacco is only the most-discussed example.
But is there a smoking gun (outside the actual gun industry, which fights to suppress research into gun safety and pushes laws restricting doctor-patient speech about firearm safety)? In other words, is there a case where we can identify a structural problem associated with corporate privacy – and not just one that depends on malfeasance or cases analogous to hiding defects in one’s house? In his review of the informatics of health law and policy, Frank Pasquale makes the case that too much corporate privacy, in the form of siloed medical information, damages both public safety and innovation. Part of the reason for this arguably falls under the rubric of corporate malfeasance, such as the tendency of pharma to suppress research findings that would damage its products’ sales (see: Vioxx), or to define clinical trials in a way that artificially inflates safety (e.g., by excluding those who removed themselves from clinical trials because they were experiencing too many adverse effects) and efficacy (e.g., by comparing with a placebo, but not the current standard of care).
The more troubling aspects are driven economically: if we assume that the massive availability of health data ought to be usable to influence health outcomes favorably (this may not turn out to be true, but it’s not an implausible thought), Pasquale suggests that the current regulatory landscape of corporate privacy stands directly in the way:
“By siloing data, health insurers and providers have impeded the types of large-scale analysis common in other industries. Providers have kept vital information about price, quality, and access secret to maintain a competitive advantage or hide shortcomings. For example, insurers keep secret many of the prices they pay. Each major drug company’s ‘data exclusivity’ may mean that rivals waste vast amounts of money pursuing leads that have already proven to be dead ends. Health information technology systems may not be interoperable, leaving them unable to ‘talk to one another’ and share data” (683).
The sort of argument here is analogous to concerns long expressed about pharma patents, which (whatever good they may do) also encourage quite a lot of waste, as research is directed toward ‘me too’ drugs (multiple remedies for erectile dysfunction or acne) rather than addressing unsolved problems (especially those that affect the poor; for papers making this sort of argument, see, e.g., here and here). So too, whatever is gained in innovation might be lost in a reduced rate of diffusion. More generally, these concerns are reminiscent of the sorts of problems identified in the anti-commons literature, which explores the way that allocating too many property rights leads to a predictable under-utilization of resources, because somebody always has the incentive to hold out for a use that would be better for them personally (so it’s the opposite problem of the tragedy of the commons). Even without the veto, too many exclusive rights increase the price of research by requiring multiple licenses.
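How quickly stacked rights choke off research can be sketched with a toy model (my own illustration, not drawn from the anti-commons literature itself): suppose a project needs a license from each of n independent rights-holders, each of whom agrees with some probability, and each of whom demands a royalty.

```python
# Toy anticommons sketch: a research project must clear n independent
# license negotiations. All parameters are hypothetical.

def project_clears(n, p_agree):
    """Probability the project clears all n negotiations, assuming each
    rights-holder agrees independently with probability p_agree."""
    return p_agree ** n

def royalty_stack(n, royalty_each):
    """Total royalty burden when each holder demands the same rate."""
    return n * royalty_each

# Even with a 90% chance of agreement in each negotiation, ten required
# licenses leave the project only about 35% likely to proceed:
print(round(project_clears(10, 0.9), 2))   # 0.35
print(round(royalty_stack(10, 0.03), 2))   # 0.3, i.e. 30% of revenue to royalties
```

The point of the sketch is that the failure is structural rather than a matter of bad actors: multiply enough individually reasonable vetoes and royalty demands together, and under-utilization follows predictably.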
Consider what Rosenblatt says about spheres that are likely to have low-IP protection:
“Negative space is likely to arise under the following four overlapping conditions: (1) when creation is driven by something other than exclusivity-based financial gain; (2) when granting exclusivity would significantly harm or deter other creation or innovation; (3) when public or creator interest in free access to creations exceeds the risks of diminished exclusivity incentives; and/or (4) when creators prefer to reinvest in creation or innovation rather than investing in protection or enforcement of intellectual property” (322).
In the case of health data informatics, where we have high levels of corporate privacy, it seems that considerations of efficiency dictate that things ought to be different, and the space for corporate privacy should be more negative. Following Rosenblatt’s schema (but treating it normatively), note that (1) makes sense in at least two ways: on the one hand, the positive externalities of better use of health data are presumably significant and could be incorporated into business models with the right regulatory structures; on the other hand, there are other market factors (like first-mover advantages, or research subsidies, which Pasquale discusses in this context) that could motivate innovation; (2) is represented by the siloing and anticommons problems; (3) is about getting the health benefits sooner and addressing problems of diffusion; and (4) ought to be a matter of what protections the regulatory schemas provide, and Pasquale makes the case that they could do better than they currently do on precisely this point.
So even if we assume the basic framework of Posner’s argument – information markets are good, innovation is good, efficiency is good, and so forth – it looks like the actual situation is not as he describes. In health informatics, at any rate, we need less corporate privacy. In the meantime, the way that internet disclosures can go viral, combined with the greater accessibility of damaging personal information like criminal records, and the growing ability to discriminate on the basis of speculative, risk-based data, mean that the personal costs of a lack of privacy can be greater than they were in an era of small data and small-town gossips.
In this case, in short, Posner has it backwards.