A recent paper by Hamid Ekbia presents an interesting Marxian theory of the relation between exploitation and computer networks. The paper is intended as an intervention into discussions of the accumulation of value in what is now called cognitive capitalism (I’ve attempted to synthesize some of that literature here). The most interesting part of Ekbia’s paper, it seems to me, is that he’s able to construct a coherent notion of class (or something close to class – he acknowledges that it’s not quite a class in the strict Marxian sense) among those who are part of the networked economy. In particular, he is able to locate those who are exploited and to define them, roughly, as a group: the “condensers.” The problem of locating a specific exploited class is important and salient partly because a Marxian theory of value cannot work unless somebody is exploited, but also because the behavior of prosumers has been the subject of intense controversy, particularly over whether they produce value. Ekbia’s contribution, it seems to me, is to show how and why some prosumers manage to be exploited.
There were some interesting cases from the Supreme Court yesterday. No, not gay marriage or Obamacare. But the Court ruled in favor of business privacy (against blanket government intrusion) and in favor of a jail inmate who had been badly treated by deputies. There’s also a potentially important regulatory takings case. I want to look at the first one for now. Los Angeles v. Patel involved an LA ordinance requiring hotel owners to keep records of specified information about their guests, and to make those records “available to any officer of the Los Angeles Police Department for inspection” on demand. Several hotel owners sued, bringing a facial challenge to the ordinance on Fourth Amendment grounds. The Court ruled (5-4, in an opinion by Sotomayor) that the statute was facially unconstitutional because it provided no way to challenge an officer who showed up with a records demand.
As I’ve suggested here before, one of the undertheorized aspects of biopower is its relation to the juridical power it supposedly supplants. Now, I think it’s a mistake to hold that biopower simply replaces juridical power – at least on Foucault’s considered view (for the sorts of reasons given in papers such as this one), and in any case I don’t think the relation should be read that way, whatever Foucault thought – but saying so then poses the problem of how the two are interrelated.
This paper by Jack Balkin (law, Yale) offers some help in disentangling the various threads. Balkin’s concern is to outline the features of what he calls the “national surveillance state,” which he proposes is our current mode of governance, having taken over and transformed the governmental apparatus from the mid-century Welfare and National Security states. The former developed through the implementation of New Deal programs, the latter through the Cold War. The two of them together, plus developments in computing power, enable the surveillance state, which is a “way of governing” that developed over the last half of the twentieth century (and thus long predates 9/11 and its aftermath):
Judge Richard Posner’s well-known application of law and economics to privacy yields results that appear, well, ideological. First, he considers what individuals do with informational privacy. What is an interest in privacy of information, he asks? Well, it’s an interest in enforcing an information asymmetry in markets. Information asymmetry is presumptively bad because it distorts the price mechanism; the price mechanism is in turn the reason that markets can claim to be both epistemically and normatively justified. They are epistemically justified because market price signals the social value of something much better than any sort of centralized planning process could, and it does so without introducing all the inefficiencies of an enormous state apparatus. The price mechanism is normatively justified because it presents no special intrusion into the lives of individuals: we are all free to do what we want and to signal (with our willingness to pay) what is important to us. In the case of privacy, for example, if I present myself or some good I am selling to you, “privacy” basically means that I’m trying to withhold relevant information about that good from you. If I apply for a job and hide a criminal record, then I’m trying to get you to overvalue me as a potential employee by keeping you ignorant of my past. Accordingly, the law should not protect such refusals to disclose, and in some cases ought to compel disclosure. Thus the first part of Posner’s article.
A couple of decades ago, I strolled through Washington Square Park on a warm summer night, idly observing the usual hustle and bustle of students, tourists, drunks, buskers, hustlers, stand-up comedians, and, sadly, folks selling oregano instead of honest-to-goodness weed. As I did so, I noticed a young man holding up flyers and yelling, “Legalize Marijuana! Impeach George Bush!” [Sr., not Jr., though either would have done just fine.] I walked over and asked for a flyer. Was a new political party being floated with these worthy objectives as central platform issues? Was there a political movement afoot, one worthy of my support? Was a meeting being called?
The flyers were for a punk rock band’s live performance the following night – at a club a block or so away. Clickbait, you see, is as old as the hills.
Cloud computing – where users keep their data (and often their applications) online – poses significant theoretical and regulatory problems. Many of these concern jurisdiction: it’s very hard even to know at a given moment where data is kept, and it’s often unclear (in the case of privacy, for example) which jurisdiction’s privacy and data-protection rules should apply (those of the data subject? of the company that collected the data? of the companies processing it?). Not only that, but U.S. and EU law are wildly inconsistent on the point, even though any large data company has to serve multiple jurisdictions.
A recent piece by Paul M. Schwartz does some valuable work disentangling these issues; here, I want to focus on one moment. Schwartz notes that cloud computing will likely induce significant changes in how firms are structured and how they structure their data handling. Back in 1937, Ronald Coase proposed that companies will decide between doing something in-house and outsourcing it based on a comparison of the costs of each. If it’s more efficient to do something in-house, using the hierarchical control structure of the firm and avoiding the complexities of dealing with markets, that’s what we can expect. If, on the other hand, it turns out to be more efficient to hire somebody else to do the job, we can expect companies to do that. Companies have to balance the difficulties of managing a project in-house against the costs of negotiating contracts with independent vendors.
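Coase’s comparison can be sketched as a simple decision rule. The function name and the cost figures below are my own illustrative assumptions, not anything from Coase or Schwartz – the point is just that the transaction costs of contracting get added to the vendor’s price before the comparison is made:

```python
# A minimal sketch of the Coasean make-or-buy comparison.
# All names and numbers are illustrative assumptions.

def make_or_buy(in_house_cost: float, vendor_price: float,
                transaction_cost: float) -> str:
    """Choose the cheaper option: producing internally, or contracting
    out at the vendor's price plus the cost of negotiating and
    monitoring the contract."""
    if in_house_cost <= vendor_price + transaction_cost:
        return "make"
    return "buy"

# A firm weighing in-house data storage against a cloud vendor:
# high contracting costs tip the decision toward the in-house option.
print(make_or_buy(in_house_cost=120.0, vendor_price=90.0, transaction_cost=40.0))  # make
print(make_or_buy(in_house_cost=120.0, vendor_price=90.0, transaction_cost=10.0))  # buy
```

On this picture, cloud computing matters because it drives down both the vendor’s price and the transaction costs of dealing with one, shifting more activity out of the firm.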
To the long list of rich entities trying to generate academic research that supports their business model, add (maybe) Google. This piece in ProPublica reports that the Stanford Center for Internet and Society had promised not to use any Google money to fund privacy research, after research done at Stanford led to a substantial fine for Google. The article was immediately followed by a lot of backpedaling and denials on everyone’s part (there’s an update on top), and it’s unclear at this point exactly what’s happening. The Stanford Center has also been the source of a lot of very good work on the Internet over the years.
That said, the blurring of boundaries between corporations and the academy has been going on for some time, and Stanford has always been at its epicenter. I suppose it’s encouraging that folks at least felt the need to deny the allegations.
Several months ago, I argued here that big data is going to make a big mess of privacy – primarily because of a distinction between “data,” understood as the effluvia of daily life, generated by such activities as moving around town or making phone calls, and “information,” which implies some sort of meaning. Privacy protects against the disclosure of “information,” since disclosure can be an intentional act; big data allows surveillance of areas traditionally considered private without any act of disclosure, since the analytic computers will take care of turning the data into information. My standard talking point here is a recent study of Facebook likes, which determined that all sorts of non-trivial correlations could be deduced from what people “like”: