This summer I learned to walk. More precisely, I learned to walk normally. My gait had gotten unsteady, and I was dragging my right foot. Work with an excellent physical therapist helped straighten me out. But balance problems, tremors, and hesitations continued.
At the beginning of August I was diagnosed with Parkinson’s. I want to describe the phenomenology of my version of it, and begin thinking through its implications for the philosophy of perception and action. But first the disease itself.
Last week I received a widely distributed announcement of a conference celebrating "The 'Stanford School' of Philosophy of Science." The 'core' members of this school are taken to be: Nancy Cartwright (Durham), John Dupré (Exeter), Peter Galison (Harvard), Peter Godfrey-Smith (CUNY), and Patrick Suppes (Stanford). The parentheses give the current affiliations of the 'core' members; this immediately suggests that if there is a 'school' at all, we are dealing either with a historical phenomenon or with a very distributed one. Scanning the list of the 'next generation' confirms that Stanford is not the current base of the purported school.
First, I adore much of the work done by many in the 'core,' but the idea that this group is a 'school' is deeply flawed. For Suppes is far better understood (as he understands himself) as belonging to the first generation of intellectual offspring of Ernest Nagel (alongside Kyburg, Pap, and Isaac Levi). Nagel successfully created American analytical philosophy by combining the scientific wing of Pragmatism with the new approaches emanating from Vienna, especially, and Cambridge (recall here and here). In his autobiography, Suppes describes how he assimilated from Nagel the significance of the history of science.
Job searches to fill permanent positions bring out the gremlins: long-suppressed personal animosities; fantasies, unmoored from reality, about the current significance of the department; conflicting aspirations about its future; mutually exclusive external pressures about the required profile of the winning candidate, etc. Professional philosophers act just like humans during hiring season. Even when the gremlins remain suppressed, a department can fail to spot the talent staring it in the face; I have seen non-great departments pass up the realistic opportunity of hiring, say, Dave Chalmers or Alva Noë (etc.). Now, one reason why such things occur is that hiring as currently practiced in professional philosophy (and I have been affiliated with seven universities in three different countries, so I am aware of the variety of practices) tends to be largely a projection of a heteronomous soul (the department) onto a thinly covered slate (the candidate). This is why each individual hiring decision is best understood as an (unfair) lottery (and, thus, departments routinely fail to hire the best talent), even though in the aggregate there may well be some collective rationality, because the list of explicit and implicit collective heuristics and biases (!) deployed tracks talent and effort reasonably well.
One might think that the previous paragraph is an argument for 'the inside candidate' (let's call it the 'TIC argument' or 'TIC' for short). For, the slate is then covered with a rich array of data-points. Now, anybody familiar with the long-run damage of 'nepotistic' hiring from within (name your favorite rotten European patronage system) will hesitate to endorse TIC; but, perhaps, the previous paragraph is an argument for TIC-lite: that is, at hiring one should favor ceteris paribus the visiting adjunct/post-doc (etc.), even granting that personalities change post-tenure/civil servant status. I would endorse good-faith TIC-lite, in fact, as introducing more sanity into our collective hiring practices, except that (a) IF the gremlins do come out in a TIC-lite situation it can poison an otherwise healthy atmosphere and (b) being a rejected TIC-lite candidate is really just about the worst possible professional experience short of economic exploitation in professional philosophy. (Of course, experiencing harassment, racism, etc. are far worse, but I wouldn't call these "professional.") Below the fold, I describe two first-hand experiences to bring out two horrible features of TIC-lite (in my ongoing 'what it's like' for the young series). I name institutions, but (with a single exception) not individuals, and I ask commentators to respect the privacy of all involved. (Well, I am a fair target, of course.)
Inspired by some comments of Jennifer Saul on Rebecca Kukla’s remarks concerning the “aggressive, argumentative” style in philosophy, Eric Schliesser and Catarina Dutilh Novaes here at NewAPPS have taken up the question of what I would call the character of philosophy. Does it consist in contests in which adversaries, having occupied positions, not only defend them vigorously but also attack the positions they take to be opposed to their own? Readers of Lakoff and Johnson’s Metaphors we live by will recognize here a familiar conceit: argument is war. How warlike should philosophy be?
Source: Jan Comenius, Orbis pictus (Syracuse, NY: C. W. Bardeen, 1887).
The illustration above is from Jan Comenius’ celebrated, oft-reprinted school-book. The Orbis sensualium pictus presents, in words and in pictures, “all the fundamental things in the world and all the acts of life”. In pictures because, after all, “in Intellectu autem nihil est, nisi priùs fuerit in Sensu” (a famous Aristotelian slogan). Knowing requires us to exercise our senses, perceiving by their means the differences of things, so as to lay the foundations of wisdom and right action. (Pictures, I should note, were still an expensive novelty, especially in books meant for children. Comenius had to have the Orbis printed in Nuremberg, not in Patak where he was teaching.)
As some of you may know, Niall Ferguson engaged in a bit of gay-bashing yesterday (links below), holding that Keynes wouldn’t have cared about future generations because he was gay (the point is apparently taken from Gertrude Himmelfarb: see the Delong item referred to below). Now he has apologized. In my view no one is obliged to accept an apology: should we accept Ferguson’s and move on, as they say?
Henry Blodget at Business Insider was one of the first with the story.
Tom Kostigen at Financial Advisor also reported on Ferguson’s remarks.
Job-talk season will soon be upon us, and before that the formidable Eastern APA. Well-appointed philosophers now come equipped with sleek slide-shows in which years of toil have been reduced to bullet points and fuzzy photos of colleagues in their offices. Although the Dark Ages of Powerpoint have passed, some presenters don’t take account of the difficulty certain members of their audience — among them, perhaps, the more eminent — will have in seeing their point. Literally.
▶Update: A very helpful comment from Teresa Blankmeyer Burke includes a link to guidelines recommended by the American Printing House for the Blind. The APH favors sans-serif fonts. I myself find Helvetica and its ilk harder to read in bulk than, say, Garamond. One compromise would be “slab-serif” fonts (the example below is American Typewriter; look here or here for more examples); “old-style” or “antique” fonts (Garamond, Janson, Goudy Old Style) may also work; avoid Times and other “modern” or “didone” fonts. The APH recommends emphatically that grey-scale graphics (“black-and-white” photos, shadowed letters) be avoided.
It’s easy to make your work accessible to the visually impaired. The bonus in doing so is that everyone will benefit. My aim here is simply to offer a few tips on accessibility. But since accessibility and good graphic design go hand in hand, the advice given here may be useful more generally.
(In remembrance of, among others, Captain Beefheart.)
It may well be that the conception of well-marked generations got its impetus from the world wars, now usually called One and Two. The first, once simply The Great War, was the war of my grandparents; the second, that of my parents. That distinction was clear, easy to remember, soundly based in events.
Number theory is notorious for producing conjectures that are easy to state but difficult to resolve. The Fermat theorem, stated in 1637—by Fermat, of course, in the margin of his copy of Diophantus’ Arithmetica—, requires nothing but a knowledge of basic arithmetic to comprehend fully. It was proved (by Andrew Wiles, building on the work of dozens of predecessors) only in 1995. The Goldbach conjecture (that every even number is the sum of two primes) and the twin primes conjecture (that there are infinitely many pairs of prime numbers p, q such that p + 2 = q), stated long ago, remain open.
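Both open conjectures are, as it happens, trivial to test by machine for small cases; that is part of their charm, since confirming them in the millions gets us no closer to a proof. A minimal brute-force sketch (my own illustration, not drawn from any of the sources mentioned here):

```python
# Checking the Goldbach and twin-prime statements for small numbers.
# Easy to state, easy to verify case by case; proving them is another matter.

def is_prime(n):
    """Trial division: adequate for the small numbers used here."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def goldbach_pair(n):
    """Return a pair of primes summing to the even number n, if one exists."""
    for p in range(2, n // 2 + 1):
        if is_prime(p) and is_prime(n - p):
            return (p, n - p)
    return None

# Every even number from 4 to 2000 has a Goldbach decomposition...
assert all(goldbach_pair(n) is not None for n in range(4, 2001, 2))

# ...and twin primes (p, p + 2) keep turning up.
twins = [(p, p + 2) for p in range(2, 200) if is_prime(p) and is_prime(p + 2)]
print(twins[:5])  # first few: (3, 5), (5, 7), (11, 13), (17, 19), (29, 31)
```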
A newer conjecture of this sort is the “ABC” conjecture. It has been a topic of excitement among mathematicians lately because a mathematician has made a credible claim to have proved it—but by idiosyncratic methods that other mathematicians will have to master before they can evaluate the proof. Proving it, moreover, would resolve a number of other outstanding problems in number theory.
▶ See the Wikipedia entry for more; see also Michael Nielsen’s very helpful page and list of references. (I should note that of the news stories he refers to, the best is that from Nature; the New Scientist story should be ignored.)
In what follows I will describe in elementary terms the conjecture and its mathematical significance. The methods used by Shinichi Mochizuki in his claimed proof are very far from elementary. I won’t discuss them; follow the links if you want to know more. In a future post I will consider some philosophical questions suggested by the theorem and its proof.
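Though Mochizuki's methods are anything but elementary, the conjecture's subject matter can be explored numerically. A minimal sketch (my own illustration): the conjecture concerns coprime triples a + b = c and the "radical" of abc, the product of the distinct primes dividing it; triples where c exceeds the radical turn out to be rare, and the conjecture bounds, roughly, how rare.

```python
# Searching for "high-quality" ABC triples: coprime a + b = c
# with c greater than rad(abc), the product of the distinct
# prime factors of a*b*c. Such triples exist, but sparsely.

from math import gcd

def radical(n):
    """Product of the distinct prime factors of n."""
    r, d = 1, 2
    while d * d <= n:
        if n % d == 0:
            r *= d
            while n % d == 0:
                n //= d
        d += 1
    if n > 1:
        r *= n
    return r

hits = [(a, b, a + b)
        for a in range(1, 100)
        for b in range(a, 100)
        if gcd(a, b) == 1 and a + b > radical(a * b * (a + b))]
print(hits[:3])  # the classic example (1, 8, 9) appears: rad(72) = 6 < 9
```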
I hate to be an optimist in response to Dennis Des Chene's dire predictions about the future of undergraduate education being turned into online distance-learning environments. So, here follows a refresher course (recall this) on why folk choose to spend a lot of money to have their children join an elite liberal arts college (or one of the many quasi-elite institutions of higher learning that emulate the elite; I hope my debt to Veblen is clear):
1. It is a form of status seeking and conspicuous consumption.
2. It provides a) a valuable social network, not to mention b) opportunities for assortative mating.
3. It is a life-style choice (see 1). Moreover, the enormous residential infrastructure keeps retention/graduation rates very high (see also 4b).
4. It a) teaches a lot of important social skills in b) a protective environment.
5. And for a minority of students it is an escape from their class.
There is or was in economics a so-called law known as Gresham’s: bad money drives out good. Another law, of broader application, would have it that good enough dominates best.
The web, including Wikipedia—to which I just happily referred—, illustrates this point. Copying is easy, compiling is easy, finding new information is not so easy, even if that means simply reading journal articles and adding a bit to the existing common store. I have Fuchs’ dystrophy, a hereditary disease of the cornea. Naturally I’d like to know all I can about it. I search online, diligently, repeatedly. What I find is the same half-dozen facts repeated again and again, often verbatim, from Wikipedia to the Mayo Clinic to NIH. As soon as one tries to investigate specific questions, e.g. about the risk of surgery, one discovers that the web has no answers. I would say that it is broad but shallow; yet even that conveys the wrong impression, since the “breadth” consists largely in repetition of a small core of fact and not obviously untrue speculation.
The information one gleans, with grains of salt for more dubious sources, is for many purposes good enough. Were you a journalist or a student needing two sentences on Fuchs’, you’d have them, quickly and without effort. But it is not much better than good enough. I think that that is a general tendency: the apparent wealth of the (publicly accessible) web belies a widespread poverty.
Good enough drives out best—and even better.
Apply this now to university teaching. Not long ago people were linking to, and praising, pieces in the Chronicle and the Globe and Mail written in defense of classroom teaching.
My initial topic is the attractions of scandal, and an oft-told story: Diderot, humiliated at the court of Catherine by his inability to answer Euler’s supposed mathematical proof of the existence of God, limps back home to Paris.
The moral generally drawn from the story is: Learn your algebra! My moral will be an admonition to historians (but not only to historians).
I’ve read the Diderot anecdote many times—mathematicians seem to like it—and I’ve long been suspicious. Inspired by a colleague’s use of it in a talk last semester, I did some checking. Here’s what I found.
John Conway explains, in “Higgs 101” at Cosmic Variance, why physicists think there has to be a Higgs field and a corresponding particle (the “carrier” of the field, as the photon is the carrier of electromagnetic fields, and the hypothetical graviton of gravitational fields). This is not for the totally naïve, but if you have a decent impressionistic grasp of high-energy physics, Conway’s piece will give you a good account of the importance of the Higgs particle to the so-called “Standard Model” in fundamental physics. Were it not to exist, that model would have to be radically revised.

The so-called “Higgs mechanism”, by which massive particles receive their mass, consists in the force of the field being “felt” by those particles as resisting their motion (as a ping-pong ball immersed in water “feels” the resistance of the water, and—in good Aristotelian fashion—therefore moves at finite speed). The Higgs mechanism “breaks” an original symmetry among fundamental particles, so that some, like the electron and the quarks, have mass and some, like the photon, don’t. Only massless particles can travel at the speed of light; all others must move more slowly.
See also the video at PhD Comics. The viXra.org blog has a nice list of papers on electroweak symmetry and symmetry breaking, from Heisenberg in 1928 to Ellis, Gaillard, and Nanopoulos in 1976, which initiated discussion of ways to detect the Higgs particle.
Not long ago Eric Schliesser asked whether published results later shown false should be retracted. His example was Pauling’s 1953 paper on the structure of DNA. I agree with Eric that the record should not be tampered with, if by “retraction” one meant that the paper would be removed from the archive.
But in the case of another, also influential, paper I am inclined to think that a correction (not a retraction) is in order. The paper in question is a famous study by A. J. Bateman on sexual selection in fruit flies (see the references at the end). Gowaty, Kim, and Anderson have recently replicated the study. They showed that Bateman’s methods are incapable of establishing the intended conclusion, which was that “sexual selection acted primarily on males through female choice and through male competition and profligacy in mating”.
We are unique in reporting a repetition of Bateman […] using his methods of parentage assignment, which linked sex differences in variance of reproductive success and variance in number of mates in small populations of Drosophila melanogaster. […] Bateman’s method overestimated subjects with zero mates, underestimated subjects with one or more mates, and produced systematically biased estimates of offspring number by sex. Bateman’s methodology mismeasured fitness variances that are the key variables of sexual selection.
A recent post here by Helen De Cruz and a not-so-recent post elsewhere by Eric Schwitzgebel will serve as hooks from which to hang some thoughts about two complementary illusions: the transparency of the present, the opacity of the past.
Helen, a lutenist, asks whether she can “ever claim to understand” the late sixteenth- and early seventeenth-century music she’s playing. There is a “gap”—a gap familiar to anyone who has undertaken to perform early music—between us and the works.
Schwitzgebel, on the other hand, is not much worried about our access to old texts.
Maybe empirically oriented philosophers typically don’t regard themselves as expert enough in history of philosophy to write about it. But I think we hobble ourselves if we allow ourselves to be intimidated. The standard of expertise for writing about Descartes or Kant in the context of a larger project — a project that isn’t just Descartes or Kant interpretation — shouldn’t be world leadership in Descartes or Kant interpretation. It should be the same standard of expertise as in writing about a contemporary colleague with a large body of influential work, like Dennett or Fodor.
In a way I agree with Schwitzgebel. A “world leader” in Descartes scholarship knows far more, in some respects, than you or I need to know in order to read the Meditations. But what exactly is the “standard of expertise” required in writing about one of our contemporaries?
Rather than reminding Young that he still has not done the minimally acceptable thing in this matter, we will just quote him as an example of a modern-day apology: in the apt words of my colleague, Dennis Des Chene, "deny having done anything exactly, um, wrong, while reducing your victim to a mere compiler of timetables." Young writes:
"That certain of Cate's phrases appeared in my book is entirely due to my inexperience and carelessness as a biographer. Sometimes a phrase just stuck in my head, appropriated so completely that it seemed to be my own. The main problem, however, was this. Cate was where I first began to try to grasp the facts of Nietzsche's life. Consequently, my notes on his book were written four or five years before I began to write the biography itself. Coming across a phrase in my notes I too quickly took it to be a précis of my reading of Cate whereas it now transpires that occasionally it was Cate's own phrase. Trained as I am to be on guard against unacknowledged use of other people's ideas, I was too relaxed when it came to the manner of reporting biographical facts. Without properly thinking about it, I tended to assume—wrongly—that the manner of reporting humdrum historical facts no more counts as intellectual property than the manner of reporting a bus timetable. Since Cate appeared in my bibliography I assumed it would be obvious that I had used him as a source of basic historical data. This was naïve and thoughtless."
Via Language Log, an irresistible remix of the Liar Paradox, from Ed McBain’s Blood Relatives (1975). The setup includes Police Detective Carella, a memo from the Commissioner about memos, and a rubber stamp. Go read the rest…
The University of Missouri does not rank very high even among US state universities. It is my impression that, unlike Berkeley, Minnesota, Indiana, Ohio State, and other large state universities, the University of Missouri has never aspired to be a “public Ivy”. Nor has the state government encouraged such aspirations. The University’s history—an increasingly familiar story—is one of chronic underfunding, in recent years made worse by a heavily Republican legislature.
On Thursday last week, buried in the third paragraph of a boringly titled press release that reiterated the President’s “six priorities for the coming year” (including, of course, “excellence”), was the news that the University of Missouri Press will be “phased out” in the 2013 fiscal year (which starts on 1 July 2012). Ten employees face, on very short notice, an uncertain future.
Jonathan Mirsky writes in the current New York Review concerning the actions of the organizers of the London Book Fair (16–18 April) in cooperating with the Chinese General Administration to exclude certain writers, some still in China, some in exile, from their official presence at the Fair (“Bringing censors to the Book Fair”, NYRB 59.9, 24 May 2012). He is drawing on the work of Nick Cohen at the Observer and also perhaps of Richard Lea at the Guardian.
This is one of those cases where the statements of the people in charge suffice to exhibit their abjectness.
Cutting-edge technologies provide apt objects for nostalgia. Their time is brief, their promise of a dust-free, odor-free, worry-free future so quickly fades that one hardly has time to notice, or to resent, the hyperbole. The Polaroid process was once a marvel—it still is, really: pictures in an instant, film that develops itself, yielding, in just seconds rather than days, the image of your loved ones or of the natural wonders you think you’d like to be reminded of someday.
But sometimes you pulled the layers apart too soon. Or you kept the film too long. The chemicals did their work, regardless. William Miller (via La Boîte verte) has collected “Ruined Polaroids” with striking results. Some, like this one, look like Abstract Expressionist paintings, others like the remains of mysterious disasters or the surfaces of far-off moons.
Miller has also published striking photographs of the polluted waters of the Gowanus Canal. See his weblog for more.
You can still buy a Polaroid Instant Camera. “Instant is back!” says the company’s website.
Not long ago I suggested that it would be better if philosophers did not read their papers but instead presented their work ex tempore, as is the norm in many other disciplines. Lively comments ensued. I recommend them to anyone who is new to giving talks, and even to those who, like me, aren’t but feel sometimes as if they were. From the well-travelled Cosma Shalizi at Three-Toed Sloth you can now read 2000 more wise words of advice on academic talks. Even if you’ve seen it all before (I haven’t), still this is the kind of thing that’s well worth repeating: the kind of thing, in other words, that we in some sense know but are very likely to forget when the fatal hour arrives (is there a theory of knowledge that takes account of this?).
Here’s a lovely issue that could not have arisen fifteen or twenty years ago. A man in Japan has obtained an injunction ordering Google to alter the auto-complete function so as not to associate his name with crimes (AFP via physorg.com; see also Japan Times). Google has refused to comply.
“A Japanese court issued a provisional order requesting Google to delete specific terms from Auto-Complete,” a spokesman for the California-based company confirmed on Monday. “The judge did not require Google to completely suspend the Auto-Complete function,” he continued. “Google is currently reviewing the order.” Tomita, the man’s lawyer, said that spelling his client’s name in a Google query box in Japan resulted in suggested searches that lead to results implying the man is guilty of crimes.
Imagine that every time someone typed your name into the Google search box, it was completed with the words ‘wants to kill Obama’ or ‘has sex with babies’. The suggested search phrase (as Google notes) isn’t, of course, written by anyone. It’s generated automatically. Moreover, a search phrase is not, one would think, an assertion. In offering it as an option, Google isn’t claiming, whenever “[Your name] has sex with babies” shows up in the auto-complete list, that you have sex with babies.
Generally speaking, I prefer that a talk be a talk and not the reading of a paper (in almost no discipline other than ours do people regularly read their papers). Of those who don’t read, some talk from notes or slides. There seems to be some value attached to making do with less. At a conference long ago, one speaker gave his thirty-minute talk from a single index card; a later speaker topped that by giving his talk from what amounted to a postage stamp. Some speakers dispense with all aids to memory and talk entirely as if extemporaneously. I have had colleagues who were very impressed with speakers who did so. I am wondering now whether it is reasonable to be thus impressed.
For my part, I would say that since the ability to give a talk without notes could indicate merely a capacity to get things by heart, and since that capacity seems, on the face of it, not to have much to do with the quality of the philosophy one produces, we haven’t much reason to be impressed by extemporaneity. Nevertheless I must admit that I am sometimes impressed by it, I suppose because, as an expression of confidence, it seems to confer a certain authority on the speaker. But confidence is hardly a guide to veracity, and still less are expressions of confidence.
So: are you impressed by speakers who talk without notes? If so, why?
Over at Choice & Inference, Jeff Helzner asks for raw Philosophical Gourmet Report data to be made publicly available. He notes that, “The PGR is based on an analysis of certain data sets, but there is often more than one reasonable way to analyze a data set,” and in fact it is now standard practice in the social and natural sciences to make raw data available upon request. Unless there are particular reasons for sequestering its data, it seems reasonable that the PGR should follow suit. The question of transparency of data has arisen in some other contexts recently. Andrew Gelman (to whom Helzner links) writes of the issues surrounding Mark Hauser that if the raw data had been publicly available, those issues would have been resolved quickly. A while ago there was considerable controversy regarding failures of transparency at The "Pluralist's Guide". We have noted the response of editors of major journals in HPS to the lack of transparency at the European Science Foundation journal ranking. In view of these and other recent cases, transparency with respect to data—as advocated by the “Open Data” movement—has become an urgent imperative, especially when the analysis of that data has serious practical consequences. Though in no way implicated in these unfortunate episodes, PGR, just because it has become the most relied-upon venue for the current ranking of philosophy departments, should set a good example by making its data publicly available.
From Malebranche via Brandon at Siris comes this ever-timely dictum:
THÉODORE: … The great secret of delivering oneself from a great many importunate people is to talk reason. This language, which they do not understand, sends them off for good, without their having grounds for complaint.
Nicolas Malebranche, Dialogues on Metaphysics and on Religion VI.viii (Pléiade 2:775; orig. publ. 1688)
A bit earlier the seeming pride here exhibited by Théodore proves to be nothing of the sort.
The Canadian Association of University Teachers has taken note of an unwise and unnecessary agreement reached by the University of Toronto and the University of Western Ontario with Access Copyright, a nonprofit organization which collects money from universities and other institutions for disbursement to copyright holders.
All is sweetness and light according to the administrators quoted in a press release from Access Copyright:
“We believe that this agreement is fair for all the parties — those who create the materials, as well as students who gain access to copyright materials through the University,” said Cheryl Misak, University of Toronto Provost.
“This enables, within certain limits, reproduction of copyright material for students’ use without concern for infringing on copyright restrictions.”
“This agreement gives us a convenient, comprehensive way to share content digitally and in paper form from a repertoire of millions of publications,” said Janice Deakin, Provost and Vice‐President (Academic) at Western. “The backdating of the agreement gives us peace of mind by covering past digital uses that may have exposed the university and the indemnity provision increases the university’s legal protection against copyright infringement.”
Some of the largest structures built by humans are invisible or go largely unnoticed. The shorelines around big cities like New York have been almost completely subordinated to the needs and wants of their inhabitants. Dredging plays a large role in the building of artificial boundaries between land and sea. BLDGBLOG, a must-read for anyone interested in architecture, reports on an exhibit by the Dredge Research Collective.
The Dredge Cycle is landscape architecture at a monumental scale, carving the coastlines and waterways of continents according to a mixture of industrial need and unintended consequences. Thus far, dredge has remained the domain of logistics, industry, and engineering, a soft successor to the elevated freeway interchanges and massive dams that captured the infrastructural imagination of the previous century.
Mohan Matthen has been commendably frank in expressing his attitude toward the history of philosophy. But in fact what he has expressed does not pertain to history in particular: it pertains to any source of inspiration. It is what you might call the practitioner’s attitude.
When one reads Julia Annas or Margaret Wilson or Michael Friedman one thinks: Gee that’s really interesting. Could Aristotle or Descartes or Kant really have thought that? […]
And then one thinks: Oh who cares? It’s so interesting that it’s worth tackling on its own. But, no doubt, it gives the question a certain beautiful frisson that Kant could have held it.
Nothing at all changes if you substitute ‘Susan Wolf’ or ‘Mohan Matthen’ for ‘Aristotle’ or ‘Descartes’. For me as a practitioner it’s of no especial import to get Aristotle or Mohan right so long as I have arrived at something of interest to me. It may be that a historian has helped me along toward that end. But what matters is that I have been inspired, that I am having interesting thoughts.
To that end presumably whatever works is licit, so long as it is not morally objectionable. I could just as well have consulted tea leaves. I am, so far as inspiration is concerned, a pure egoist, a solipsist even, since it hardly matters whether I have listened to you or merely dreamt of listening to you.
The attitude of the practitioner ought not, therefore, to be given any weight in judging the worth of history. It is too indiscriminate. The practitioner is indifferent to anything that fails to inspire interesting thoughts, and will value anything that does, whatever its intrinsic worth.