Religious disagreements are conspicuous in everyday life. Most societies, except perhaps theocracies or theocracy-like regimes, show a diversity of religious beliefs, a diversity that even young children are aware of. One emerging topic of interest in the social epistemology of religion is how we should respond to religious disagreement. How should you react if you are confronted with someone who seems equally intelligent and thoughtful, who has access to the same evidence as you do, but who nevertheless ends up with very different religious beliefs? Should you become less confident about your beliefs, or suspend judgment? Or is it permissible to accord more weight to your own beliefs than to those of others?
In November and December 2014, I surveyed philosophers about their views on religious disagreement. I was not only interested in finding out what philosophers think about disagreements about religious topics in the profession (for instance, do they consider other philosophers as epistemic peers, or do they take the mere fact of disagreement as an indication that the other can't be right?), but also in the influence of personal religious beliefs and training. I present a brief summary of results below the fold; a longer version can be found here.
What do philosophers think about religious disagreement? This is a brief survey (takes about 5-10 minutes) to find this out. The survey is aimed at academic philosophers, by which I mean people who hold a PhD in philosophy or are graduate students in philosophy. If you fit these criteria, please consider participating. Participation is fully anonymous.
The format of the study is a multiple-choice questionnaire. I will ask some personal questions, among other things about your religious views, but your name will not be asked. To further ensure that your anonymity is preserved, I will not report on individual responses but only on statistical patterns. There are a few places where you can provide an open response (optional). I will publish at most one open response per participant, making sure that there is no identifying information within your response. The full dataset will remain confidential and will not be shared with anyone. I will report the preliminary results on NewApps and one or two other websites.
The study is designed and carried out by Helen De Cruz, postdoctoral fellow of the British Academy at the University of Oxford. If you have any questions or concerns, please contact helen.decruz- at - philosophy.ox.ac.uk. To participate, please click here or paste this link in your browser: https://surveys.qualtrics.com/SE/?SID=SV_9TFkp1QkxnZkdTL
A few years ago, I read the Philosophy Smoker on a regular basis. In the comments threads, several job seekers complained about older professors who didn't retire: if only they would finally go, more tenure lines would become available for junior people. In a provocative essay, professor emerita Laurie Fendrich argues along similar lines. She argues that professors have a moral duty to retire. The reasons why they don't, she argues, are largely self-serving: the large income of a senior faculty member, the pleasure of teaching: "Professors approaching 70...have an ethical obligation to step back and think seriously about quitting. If they do remain on the job, they should at least openly acknowledge they’re doing it mostly for themselves."
Unlike in the US, where the mandatory retirement age for professors (then 70) was lifted in 1994, European professors are still obliged to retire when they reach a given age (usually between 65 and 67). It is certainly a good thing that tenured lines eventually open up again and younger academics can step in. But does that mean that older professors in the US also have a moral obligation to step down when the time comes? For one thing, many tenured positions aren't being replaced by junior tenure lines but by contingent (VAP, adjunct, etc.) positions. Also, pension schemes were gutted during the 2008 financial crisis and its aftermath, which made it financially precarious for older professors to retire.
I agree that high retirement ages are problematic, but I disagree that individual older professors have a moral duty to retire. If the total effects of scrapping the retirement age for professors are negative, that is a reason to reintroduce a mandatory retirement age in US academia, but it does not put the burden of that decision on individual professors in their late 60s or older. Let's look at some of the main arguments Fendrich offers:
The following letter was adopted by the Northwestern University Philosophy Graduate Students by way of a vote:
As many in the philosophical community already know, sexual misconduct is a prevalent problem in the discipline. Our department is currently bearing the weight of its own controversy regarding sexual misconduct, and we fear that particular developments in the situation at our institution could have far-reaching consequences for victims elsewhere and going forward. After being accused by two different students of sexual assault, and being found responsible for sexual harassment by the university’s Sexual Harassment Prevention Office in each case, a member of our faculty responded by suing (among others) the university, one of his colleagues, and, most troubling to us, a graduate student.
While some of the legal matters await final resolution, litigation itself raises new practical and ethical considerations. Anyone named as a defendant in a legal complaint will naturally be advised by her* attorney not to discuss the matter with others. The silencing effects for a student who is sued by a professor for alleging sexual misconduct within her university's own reporting procedures must be distressing: she will likely be isolated against her will from educational, professional, and personal support networks by someone who has proven willing to sue not only her, but also those who would support her.
Many instances of sexual misconduct go unreported, in large part because the risks of reporting are many and serious while the potential gains are very slim. The risks of further loss of community, of damaging actual and potential professional relationships, of not being believed, or of being reduced to how one has been treated, rather than being perceived as an intelligent, talented, and valuable individual and member of a community, already deter victims from reporting. Add to these risks the possibility of being named in a lawsuit (and the consequent potential for financial ruin if not indemnified) as well as having personal information and details of a traumatic experience made public, and the hazards are substantially multiplied.**
The pursuit of this legal strategy and its silencing effects should be troubling to the philosophical community. Suing a graduate student for filing an internal, and otherwise confidential, sexual misconduct complaint is intolerable. If the legal strategy implemented by this faculty member is treated as acceptable, it is not only injustice to our fellow graduate student that is at stake--though this is a substantial concern in itself. The implications for victims going forward, within the profession and otherwise, are staggering. Treating such an approach as acceptable effectively amounts to accepting those implications, as well as silence for ourselves. It should be noted that even if a faculty member feels that their due process rights have been infringed upon, and even if no university grievance process is available, there are other courses of action available under, for example, Title VII and Title IX. Consequently, we feel we must vigorously repudiate this legal tactic and provide vocal support to those whose voice would be taken from them.
One necessary step to adequately supporting victims is opening up communication between administrators, faculty, and graduate students, both within departments and across them. We hope that the discipline’s internal conversation regarding sexual misconduct will continue, that it is honest yet sensitive to issues of vulnerability and power dynamics, and that we do not avoid confronting discrimination and exclusion because we are too concerned with privacy to do anything at all. When we act as if privacy concerns cannot be appropriately balanced with substantive communication, we only exacerbate the stigma that victims already feel. The fact that philosophy departments have become a central battleground for rights to non-discrimination in academic settings is a result of a collective failure on the part of our discipline, and as such will require collective action to rectify it. We hope this letter will serve as a small step towards that end.
We admire our fellow graduate student for her strength and bravery and are proud to share an intellectual home with a colleague who is both uncommonly brilliant and courageous. We stand with her, and in support of victims everywhere.
*While we are using the feminine pronoun, this is applicable to persons of any gender.
**Here, we do not mean to suggest that the personal information made public need be accurate. Unfortunately, whoever initiates a legal complaint has the advantage of being able to make public a narrative of events that indeed need not be accurate, or even approximately accurate, unless it is to prevail in court. When filing a motion to dismiss, a defendant does not have the opportunity to dispute matters of fact. This only underscores our concern about the impact this new precedent may have on victims.
In a forthcoming paper, John Schellenberg forwards the following argument: anatomically modern humans have been around for about 200,000 years. That's a very short span of time for any species, and only in the past few thousand years have we been reflecting on the world around us. If our species survives even as long as Homo erectus did, we've only completed a very small part of a potentially long future of thinking about religion, metaphysics, and other matters.
At present, philosophy of religion in the analytic tradition is quite narrowly focused:
"in the west – and I expect I am writing mainly for western readers – philosophy of religion has been largely preoccupied with one religious idea, that of theism, and it looks to be moving into a narrower and deeper version of this preoccupation, one focused on specifically Christian ideas, rather than broadening out and coming to grips with its full task." (p. 3).
Theism, in a generic, omni-property sort of way, is one position that philosophers of religion commonly defend. The other is scientific naturalism. These seem to be the only games in town:
"most naturalists too assume that theistic God-centered religion must succeed if any does. Naturalism or theism. These seem to be the only options that many see. The harshest critics of religion, including philosophers such as Daniel Dennett, seem to think their job is done when they have, to their own satisfaction, criticized personalistic, agential conceptions of a divine reality." (pp. 3-4).
At the end of 2013, I conducted a qualitative survey (summary here, but I am writing up the paper presently) among philosophers of religion. In addition to a series of open questions, there was a question inviting open feedback. I was quite surprised to see so many philosophers of religion openly lament the lack of subject diversity in their discipline. Just a few choice examples written by anonymous respondents:
Argumentation gets a bad press. It’s often portrayed as futile: people are so ridden with cognitive biases—less technically, they are pigheaded—that they barely ever change their mind, even in the face of strong arguments. In her last post, Helen points to some successes of argumentation in laboratory experiments with logical tasks, but she doubts whether these successes would extend to other domains such as politics or morality.
I think this view of argumentation is unduly pessimistic: argumentation works much better than people generally give it credit for. Moreover, even when argumentation fails to meet some standards, the problem might lie more with the standards than with argumentation. Here are some arguments in support of a view that is both more realistic in its aspirations and more optimistic in its depiction of argumentation—we’ll see if these arguments can change Helen’s mind about the power of arguments.
It is well attested that people are heavily biased when it comes to evaluating arguments and evidence. They tend to evaluate evidence and arguments that are in line with their beliefs more favorably, and tend to dismiss those that aren't. For instance, Taber and Lodge (2006) found that people consistently rate arguments congruent with their views on gun control and affirmative action as stronger than arguments that are incongruent with their views on these matters. They also had a condition in which people could freely pick and choose information to look at, and found that most participants actively sought out sympathetic, nonthreatening sources (e.g., those pro gun control were less likely to read the anti-gun-control sources that were presented to them).
Such attitudes can frequently lead to belief polarization. When we focus on just those pieces of information that confirm what we already believe, we get further and further entrenched in our earlier convictions. That seems like a bad state of affairs. Or is it? The argumentative theory of reasoning, put forward by Mercier and Sperber, suggests that confirmation bias and other biases aren't bugs but design features. They are bugs only if we consider reasoning to be a solitary process of a detached, Cartesian mind. Once we acknowledge that reasoning has a social function and origin, it makes sense to stick to one's guns and try to persuade the other.
Like an invisible hand, the joint effects of biases will lead to better overall beliefs in individual reasoners who engage in social reasoning: "in group settings, reasoning biases can become a positive force and contribute to a kind of division of cognitive labor" (p. 73). Several studies support this view. For instance, some studies indicate that, contrary to earlier views, people who are right are more likely to convince others in argumentative contexts than people who think they are right. In these studies, people are given a puzzle with a non-obvious solution. It turns out that those who find the right answer do a better job at convincing the others, because the arguments they can bring to the table are better. But is there any reason to assume that this finding generalizes to debates in science, politics, religion and other things we care about? It's doubtful.
This article in Aesthetics for Birds has some interesting statistics on the percentage of papers authored or co-authored by women and minorities in the top print aesthetics journals: Journal of Aesthetics and Art Criticism and British Journal of Aesthetics. About 20% of articles in these journals are written by women in the period from 2010 onwards. When we look at memberships of professional aesthetics organizations, the percentage of female aestheticians is about 32%. So that means women are underrepresented in JAAC and BJA. What can account for this disparity? JAAC keeps a record of gender and geographic location of submissions.
Sherri Irvin finds "It is notable that over the past three years, women authors have submitted to JAAC at a rate substantially higher than the rate at which they are published in JAAC from 2010-2014, and closer to the proportion of women members in the ASA. During 2 of the last 3 years, the acceptance rate for women has been lower than for men. Though the differences seem small (only 2-3 percentage points), another way of putting them is that in 2012-3, men were 21.4% more likely than women to have their manuscripts accepted, while in 2013-4, they were 11.6% more likely." She also writes "US submissions tend to be accepted at a rate slightly over 20%, while submissions from non-English-speaking countries tend to be accepted at far lower rates".
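The leap from a "2-3 percentage point" gap to "men were 21.4% more likely" is just the absolute gap re-expressed relative to the women's acceptance rate; on a low base rate, a small absolute gap becomes a large relative one. A minimal sketch with hypothetical acceptance rates (the post does not give the actual JAAC base rates, so the numbers below are illustrative only):

```python
def relative_increase(p_men: float, p_women: float) -> float:
    """Percent by which men's acceptance rate exceeds women's."""
    return (p_men - p_women) / p_women * 100

# Hypothetical rates: a 3-point absolute gap on a 14% base rate
# already amounts to a ~21% relative difference.
print(round(relative_increase(0.17, 0.14), 1))    # 21.4
# A 2-point gap on a ~17% base rate gives roughly the 11.6% figure.
print(round(relative_increase(0.192, 0.172), 1))  # 11.6
```

This is why seemingly small percentage-point differences can translate into headline relative disparities.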
JAAC practices double-anonymous refereeing. I am in the statistics, since I co-authored an article that was published in JAAC in 2011. My co-author and I were very pleased with the thoroughness of our reviewer, who is one of the few experts on the aesthetics of paleolithic art. We could guess who he was, and it turned out (as he later communicated to us) he also had an inkling as to who we were. Aesthetics is a small world. The only time I reviewed for JAAC I didn't know who the author was, so I believe I reached a verdict that was unsullied by considerations of the author's identity. But was it? Thoughts about the identity of an author can play a role in one's decision even when you don't want them to; that is, after all, how implicit bias works.
There have been lots of discussions about the PGR (e.g., here), especially concerning its editor, Brian Leiter, including a poll on whether the 2014 edition should be produced. Regardless of the outcome, I think we can already start considering alternative ways, independent of the PGR, to provide information for prospective philosophy graduate students.
Ideally, such information should not be primarily about rankings of quality. Quality is a complex concept that is vulnerable to biases and to entrenching the status quo. We should rather provide prospective grad students with clear measures of placement rates and of places where they could study the topic of their choice. Perhaps any type of ranking will be problematic. We could instead provide descriptive info on a wide range of topics, e.g., where are the places to study experimental philosophy, French continental philosophy, etc. One can give that info *without* giving an overall rank of perceived quality.
The methodology by which placement rates are compiled and by which assessments of strengths within departments are made should be empirically informed by the social sciences, e.g., in the selection of the experts who make these assessments.
Collecting and disseminating this information shouldn't be in the hands of one individual but should be a shared responsibility. I originally thought it was something the APA, or perhaps a task force consisting of people from the APA, the AAP, etc. could do, but I am now not so sure whether this is a good idea. PhilPapers+ seems like a good place to host the information, especially given that prospective graduate students will already be familiar with PhilPapers.
It would be nice to expand information for prospective graduate students to non-anglophone departments. There are lots of grad students outside the English-speaking world who could benefit from lists of placement records and specializations of faculty members outside the US, UK, etc.
In a recent survey, I asked philosophers about their submissions to journals, to get a sense of what journals people submit to and also what factors might influence their decisions on where to submit papers. Specifically, I wanted to know how frequently people submit their work to the top 5 journals in philosophy, which are usually regarded (according to polls) as the best journals in the field: Philosophical Review, Journal of Philosophy, Mind, Noûs and Philosophy and Phenomenological Research. Increasingly, publications in these journals are regarded as a marker of excellence.
However, there are several hurdles to getting published in the top 5. The acceptance rates are forbidding: I don’t have exact numbers for all of them, but some journals in the top 20 have published acceptance rates as low as 5% (e.g., Australasian Journal of Philosophy, Canadian Journal of Philosophy). Presumably, the acceptance rates in the top 5 are lower still, making them more difficult to get into than Science or Nature. Also, review times at some of these journals tend to be longer than the standard 3 months. Those journals that are quicker close submissions for half the year, and unfortunately, they do so concurrently (otherwise, as a senior philosopher pointed out to me, they wouldn’t have the lower submission numbers they are aiming for).
251 philosophers completed the survey. Below the fold is a summary of some results. I asked respondents to say how many papers they submitted to top-5 journals and any refereed journal over the past year (i.e., since September 2013).
Although over half the world's population are theists (according to Pew survey results), God's existence isn't an obvious fact, not even to those who sincerely believe he exists. Or, as Keith DeRose recently put it: even if God exists, we don't know that he does. This presents a puzzle for theists: why doesn't God make his existence more unambiguously known? The problem of divine hiddenness has long been recognized by theists (see, for instance, Psalm 22), but only fairly recently has it become the focus of debate in philosophy of religion.
In several works, J.L. Schellenberg has argued that divine hiddenness constitutes evidence against God's existence. A simple version of this argument goes as follows (Schellenberg 1993, 83):
1. If there is a God, he is perfectly loving.
2. If a perfectly loving God exists, reasonable non-belief in the existence of God does not occur.
3. Reasonable non-belief in the existence of God does occur.
4. Therefore, no perfectly loving God exists. (from 2 and 3)
5. Therefore, there is no God. (from 1 and 4)
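Since the argument is just modus ponens plus modus tollens, its validity can be checked mechanically. Here is a minimal propositional sketch in Lean (the labels G, L, and R are my own shorthand, not Schellenberg's):

```lean
-- G : there is a God; L : a perfectly loving God exists;
-- R : reasonable non-belief in the existence of God occurs.
example (G L R : Prop)
    (h1 : G → L)       -- premise: if there is a God, he is perfectly loving
    (h2 : L → ¬R)      -- premise: a perfectly loving God precludes reasonable non-belief
    (h3 : R)           -- premise: reasonable non-belief occurs
    : ¬G :=
  fun g => h2 (h1 g) h3  -- from G we get L, hence ¬R, contradicting h3
```

Validity is not where the action is, of course: the philosophical debate concerns whether the premises are true.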
The controversial premises are 2 and 3. Authors like Swinburne and Murray have argued against premise 2: God may have reasons to make his existence less obviously true. Their arguments state that if we knew God existed, we wouldn't be able to make morally significant choices. This is an empirical claim. Obviously, it cannot be experimentally tested directly. However, research in the cognitive science of religion (CSR) on the relationship between belief in God and morality may indicate whether or not this is a plausible claim.
New APPS readers probably remember Helen De Cruz's excellent post on the polarized debate surrounding evolutionary science (which was picked up by NPR), as well as Roberta Millstein's follow-up post on the perhaps equally polarized debate concerning climate change. Both posts cite the work of Dan Kahan, who has a distinct take on these issues:
"I study risk perception and science communication. I’m going to tell you what I regard as the single most consequential insight you can learn from empirical research in these fields if your goal is to promote constructive public engagement with climate science in American society. It's this: What people “believe” about global warming doesn’t reflect what they know; it expresses who they are."
I just attended a talk by Michael Ranney, who opposes Kahan's position. In Ranney's view, communicating the mechanism of global climate change is enough to change the minds of people on both sides of the political spectrum. (Check out the videos!) Ranney shows, surprisingly, that just about no one understands the mechanism of climate change (Study 1). Further, he shows that revealing that mechanism changes participants' minds about climate change (Study 2).
Google the keywords “academic” and “mother” or “motherhood”, and you will find various websites with discussions about the baby penalty in academia for women. Representative of this literature is an influential Slate article by Mary Ann Mason, who writes “For men, having children is a career advantage; for women, it is a career killer. And women who do advance through the faculty ranks do so at a high price. They are far less likely to be married with children.”
As an untenured mother of two children, I find these reports unsettling. When my second child was born, several women who are junior academics approached me to ask me if it was doable, or how I managed to get anything done. They wanted children but were scared that it would kill their careers. How do children impact one’s work? This got me thinking that it would be good to hear the stories of philosophers who did manage to combine a flourishing academic career with parenthood.
To this end, I interviewed seven tenured professors who are parents. Six of them are mothers, but I decided to also include an involved father. I aimed to include some diversity of circumstance. Some of my interviewees have very young children, whereas one respondent has grown children; she had them at a time when combining motherhood with a professorship was even less common than it is now. One of my interviewees is a single mother who had her child in graduate school. One went to a first-round APA interview when her son was six weeks old, with a sitter in the hotel room. Two of my interviewees have special-needs children, a fact that shaped their academic careers in important ways. I also aimed for geographic diversity—my respondents come from the US, the UK, Canada and The Netherlands—since countries and institutional cultures differ in the formal and informal support parents receive, such as paid leave and childcare.
I have long believed the conventional wisdom that women are not proportionately distributed through every subfield in philosophy. In my field of theoretical ethics, in particular, it is often said that more women in philosophy seem to be found here than are in the profession more widely.
I believe it a little less today, though it may still turn out to be true. Trent University student Cole Murdoch undertook a short summer research project for me, looking at the ratio of male to female authors in two leading journals of moral philosophy.
Although we still have data to wade through, it is interesting to me that, looking at a five-year window of publications in Ethics and the Journal of Moral Philosophy, the student did not find that women-authored articles appeared in much greater proportions than women's numbers in the profession would suggest. I tasked him with this merely to find out who and what the journals in my field publish, for self-interested reasons, but I also expected that, as we regularly hear that women in philosophy disproportionately specialize in ethics, he'd find much more parity in JMP and Ethics, or at least higher numbers of women's names than one might find in the profession at large. [see below for a report of the analysis]
How can we combine the economic necessities of work with caring for infants? This dilemma recurs across cultures, and western culture is no exception. In a series of interviews with professors who are mothers (which I hope to put on NewApps by the end of this month), one of my respondents, who has grown children, remarked about their preschool years:
"I was completely stressed out. It wasn’t just that childcare was expensive—and even with two salaries it was a stretch: It was insecure. If a childcare provider decided to quit, I would be left in the lurch; if my kid wet his pants once too often he’d be kicked out of pre-school [which had strict rules about children being toilet-trained] and I’d have to make other arrangements."
This concern resonates with many parents. It is especially acute among low-income, single mothers who struggle to find last-minute childcare to fit their employers' unpredictable scheduling. Also symptomatic are heart-wrenching stories about a woman whose children were taken away because she failed to find childcare when she had to go on a job interview and left them in a car, or a woman who was arrested for allowing her nine-year-old daughter to play in a park while she worked in a nearby fast food restaurant.
Can we learn anything from how other cultures solve the working mother's dilemma?
Thomas Reid argued that the human default trust in testimony is a gift of nature, which is sustained by two principles that "tally with each other", the propensity to speak the truth, and the tendency to trust what others tell us. Interestingly, he observed an embodied aspect of this trust:
It is the intention of nature, that we should be carried in arms before we are able to walk upon our legs; and it is likewise the intention of nature, that our belief should be guided by the authority and reason of others, before it can be guided by our own reason. The weakness of the infant, and the natural affection of the mother, plainly indicate the former; and the natural credulity of youth, and authority of age, as plainly indicate the latter. The infant, by proper nursing and care, acquires strength to walk without support. (1764, Inquiry into the Human Mind, chap. VI, Of Seeing)
Reid's observations point to an intriguing possibility: to what extent is social cognition, such as trust in testimony, influenced by our bodily position, in particular the position we have as helpless infants? The Japanese primatologist Tetsuro Matsuzawa has argued that the supine position (that is, lying on the back) of human newborns has been a decisive factor in the evolution of human social cognition.
Humans and chimpanzees differ quite markedly in how much they trust others. For instance, although both chimpanzees and humans imitate, human children are more prone to overimitation than juvenile chimps: the children, but not the chimps, indiscriminately follow actions by an adult that are redundant to obtaining a desired result (see, e.g., here).
In order to examine and address issues of participation faced by minority and underrepresented groups in academic philosophy (e.g. gender, race, native-language, sexual orientation, class, and disability minorities), a number of UK departments have recently started to build a UK network of chapters of MAP ( www.mapforthegap.com ).
With 24 active chapters to date, MAP (Minorities And Philosophy) is already a successful and widespread organization in the US and elsewhere. If you would like to have a MAP chapter at your own institution, this Call For Collaborators is for you. MAP chapters are generally run by graduate students (typically 3 or 4 per department), with some help from academic staff members; undergraduate participation is also encouraged.
At this stage we would be happy to hear especially from graduate students (groups or individuals) at UK Philosophy departments as well as from UK Philosophy academic staff who would like to coordinate graduate student interest in their institutions. Please contact Filippo Contesi (filippo.contesi at gmail dot com).
To my knowledge, full book manuscripts are never reviewed anonymously. Given that the double anonymity of peer review is implemented to decrease biases, and presumably thereby to increase the focus on the quality of the writing, this is puzzling. David Chalmers wrote, in a very helpful comment on how to publish a book: "Most book refereeing is not blind, unlike journal refereeing. And when what's being reviewed is a proposal rather than a full manuscript, reputation of the author make a huge difference in reviewers' and editors' confidence that the proposal will be fleshed out well to a book."
While I can see that the reputation or renown of an author can legitimately play a role at the proposal stage, in assessing the competence of the prospective author to write a full manuscript, I don't see why it should play a role when the full manuscript is reviewed. Yet this will inevitably happen when review of full manuscripts is non-anonymous. It would be hard not to be influenced if the author of the manuscript one is reviewing happened to work at a high-ranking institution, be very senior, and already have an excellent track record (I declined to review a book for a major press for this reason), or conversely, if the author were relatively junior, working at a teaching-focused or obscure place.
This is part 3 of a 3-part series of interviews with philosophers who left academia right after grad school or, in some cases, later. See part 1 for the jobs they held, and part 2 for how they evaluate their jobs. This part will focus on the transferable skills of academics.
The burning question for academics who want to leave academia is: what transferable skills can they bring to the private sector? The responses of the seven people I interviewed clearly indicate that the skills that transfer are broad and fairly high-level.
This is part 2 of a 3-part series of interviews I conducted with seven philosophers who went on to a non-academic career after obtaining their PhDs. For more background on these philosophers, the work they currently do, and the reasons they left academia, see part 1: How and Why do they end up there? This part will focus on the realities of having a non-academic job.
One of the main attractions of an academic job, especially that of a tenured professor, is the autonomy (intellectual and in terms of time management) it provides. However, there are downsides as well: the increasing pressure to churn out publications (which some of the respondents already alluded to in part 1), lack of support, and isolation lead to mental health problems in some academics. So how do philosophers with experience both inside and outside academia evaluate the work atmosphere?
This is the first of a three-part series featuring in-depth interviews with philosophers who have left academia. This part (part 1) focuses on their philosophical background, the jobs they have now, and why they left academia. Part 2 examines the realities of having a non-academic job and how it compares to a life in academia. In part 3, finally, the interviewees reflect on the transferable skills of a PhD in philosophy, and offer concrete advice to those who want to consider a job outside of academia.
Does having a PhD in philosophy mean your work opportunities have narrowed down to the academic job market? This assumption seems widespread; for example, a recent Guardian article declares that programs should accept fewer graduate students because there aren’t enough academic jobs for all those PhDs. Yet academic skills are transferable: philosophy PhDs are independent thinkers who can synthesize and handle large bodies of complex information, write persuasively (for instance, when applying for grants), and speak to diverse kinds of audiences.
How do those skills translate concretely into the non-academic job market? To get a clearer picture of this, I conducted interviews with seven philosophers who work outside of academia. They work as a consultant, software engineers, an ontologist (not in the philosophical sense of ontology), a television writer, a self-employed counselor, and a government statistician. Some were already actively considering non-academic employment as graduate students; for others the decision came later—for one informant, after he received tenure.
These are all success stories. They are not intended to be a balanced representation of the jobs former academics hold. Success stories can provide a counterweight to the steady drizzle of testimonies of academic disappointment, where the inability to land a tenure track position is invariably couched in terms of personal failure, uncertainty, unhappiness and financial precarity. In this first part, I focus on what kinds of jobs the respondents hold, and how they ended up in non-academic jobs in the public and private sector. Why did they leave academia? What steps did they concretely take to get their current position?
I hope this series of posts will empower philosophy PhDs who find their current situation less than ideal, especially—but not only—those in non-tenure track positions, to help them take steps to find a non-academic career that suits them. And even if one’s academic job is as close to a dream job as one can conceivably get, it’s still fascinating to see what a PhD in philosophy can do in the wider world.
There are several variants in circulation of a list of skills our grandparents had but the majority of us lack, for instance, 7 skills your grandparents had and you don't. Examples include ironing really well, sewing, knitting, crocheting, canning, cooking a meal from scratch, writing in beautiful longhand, basic DIY skills... What have the majority of us lost by no longer having these skills, which I'll call grandparent skills for short?
As Lizzie Fricker argued today in a workshop held in honor of Charlotte Coursier, trust in other people is common and is a pervasive element of human life. We defer to the knowledge of others (testimonial dependence) and to their expertise (practical dependence): we rely on experts to tell us what the weather will be like, to fix our car, to give us a new haircut. Often, this deference is shallow and dispensable (we could in principle do it ourselves), but it can also be deep and ineluctable, as when we rely on electricians and other specialists.
This division of cognitive labor provides us with enormous gains, but does an increased reliance on the testimony and expertise of others also come with costs? Fricker feels we do not reflect enough on this question, especially as the extent of both testimonial and practical dependence seems to have increased dramatically in recent years. People increasingly rely on Google rather than internally stored semantic knowledge, and they increasingly outsource practical skills – navigation with maps, dead reckoning, and compasses is being replaced by user-friendly technologies like GPS devices.
I live very close to Port Meadow, one of the largest meadows of open common land in the UK, already in existence in the 10th century, and mentioned in the Domesday book in 1086. I saw my first-ever live, wild oriole there. The land has never been ploughed, so it is possible to discern outlines of older archaeological remains, some going back to the Bronze Age. The consistent management of the land makes the changes predictable: it turns into a lake in winter, is sprinkled with buttercups this time of year (see pictures below the fold - both are taken at about the same place, but one in May and the other in November), and looks mysterious and misty in the fall. Whenever I walk on Port Meadow I take my camera, anxious to capture any beautiful view that falls on my retina and preserve it for future memories. And, like many other parents, I take dozens of pictures of my growing children. Recently, I saw an NPR piece (no author given) that took issue with this tendency to want to preserve pictures for future memory.
The article launches a two-pronged attack against pictures. First, by worrying about capturing the moment, we lose the transience and beauty of the moment and enjoy it less. Second, the article cites psychological evidence showing that people actually remember fewer objects during a museum visit if they were allowed to take photos of them, compared to when they were only allowed to observe them. The phenomenon is known as the photo-taking-impairment effect. Linda Henkel, who discovered the effect, says: "Any time…we count on these external memory devices, we're taking away from the kind of mental cognitive processing that might help us actually remember that stuff on our own."
Helen De Cruz has some excellent suggestions for how to talk to creationists, given that neither debate nor denouncement is likely to be productive. She describes the way in which a religious person who is not a creationist can speak to another religious person who is a creationist, e.g., by pointing out that Biblical literalism is a recently emerged approach, one that may be impossible to apply consistently, and that, for this reason among others, may not be consistently practiced by anyone.
This article by Dan Kahan suggests that disbelief in human-caused climate change is like belief in creationism in this respect: what people "believe" about each doesn't reflect what they know, but rather expresses who they are. This supports the thesis that providing evidence against creationism isn't likely to change minds, and that providing evidence for human-caused climate change isn't likely to change minds, either.
But what is the climate change equivalent, where we speak to people from their own perspective as Helen proposes that we do for religious people who are creationists?
A friend of mine is doing her DPhil in Oxford. She's American, and out of term she goes back to her home in middle America. She recently went to see the newly refurbished museum in her home town. When she was looking at the displays on human evolution, a museum guard, who had been observing her, suddenly said "So, what side are you on: the Bible or evolution?" Whereupon my friend replied "What do you mean what side am I on? This is not a football game, you know".
I am deeply troubled by the incipient creationism, which treats biblical literalism as a serious intellectual contender to scientific inquiry. I want my children to grow up with normal biology textbooks, not with Of Pandas and People. If creationists win their lobbying efforts to make creationism mainstream in schools and the public sphere, that is a loss for everyone (including the creationists). Debates don't seem to do any instrumental good. If we are not going to fight creationism through debates, how can we - as public intellectuals - ensure that creationism doesn't encroach even further upon our schools and public life?
On the basis of this year’s partial hiring data, Marcus Arvan notes that the majority of tenure track hires (a whopping 88%) come from Leiter-ranked programs. Only 12% of hires come from unranked programs. Moreover, 37% of all tenure track hires come from just five schools, the Leiter top 5 - this is amazing if one ponders it, and one may wonder where philosophy is heading if most of its future tenured workforce comes from just a few select programs.
This has caused a lot of debate: why would people go to grad school in unranked programs at all? Why attend an unranked program if you can’t get into a highly ranked one? But what is often overlooked are the many factors, such as class and ethnic background, that may contribute to someone not getting into (or, as I will examine in more detail below, even applying to) top programs. In fact, going for pedigree may be a particularly effective way to screen out people who come from poorer backgrounds and different ethnicities.
A few weeks ago I had a post on different ways of counting infinities; the main point was that two of the basic principles that hold for counting finite collections cannot both be transferred over to the case of measuring infinite collections. Now, as a matter of fact I am equally (if not more) interested in the question of counting finite collections at the most basic level, both from the point of view of the foundations of mathematics (‘but what are numbers?’) and from the point of view of how numerical cognition emerges in humans. In fact, to me, these two questions are deeply related.
In a lecture I’ve given a couple of times to non-academic, non-philosophical audiences (so-called ‘outreach lectures’) called ‘What are numbers for people who do not count?’, my starting point is the classic Dedekindian question, ‘What are numbers?’ But instead of going metaphysical, I examine people’s actual counting habits (including among cultures that have very few number words). The idea is that Benacerraf’s (1973) challenge of how we can have epistemic access to these elusive entities, numbers, should be addressed in an empirically informed way, including data from developmental psychology and from anthropological studies (among others). There is a sense in which all there is to explain is the socially enforced practice of counting, which then gives rise to basic arithmetic (from there on, to the rest of mathematics). And here again, Wittgenstein was on the right track with the following observation in the Remarks on the Foundations of Mathematics:
This is how our children learn sums; for one makes them put down three beans and then another three beans and then count what is there. If the result at one time were 5, at another 7 (say because, as we should now say, one sometimes got added, and one sometimes vanished of itself), then the first thing we said would be that beans were no good for teaching sums. But if the same thing happened with sticks, fingers, lines and most other things, that would be the end of all sums.
“But shouldn’t we then still have 2 + 2 = 4?” – This sentence would have become unusable. (RFM, § 37)
I have been thinking about an analogy to the Bechdel test for philosophy papers - this in the light of recent observations that women get fewer citations even if they publish in the "top" general philosophy journals (see also here). To briefly recall: a movie passes the Bechdel test if (1) there are at least 2 women in it, (2) they talk to each other, (3) about something other than a man.
A paper passes the philosophy Bechdel test if (1) it cites at least two female authors, (2) at least one of these citations engages seriously with a female author's work (not just "but see" [followed by a long list of citations]), and (3) at least one of the female authors is not cited because she discusses a man (thanks to David Chalmers for suggesting #3).
The usual cautionary notes about the Bechdel test apply here too. A paper that doesn't meet these standards is not necessarily deliberately overlooking women's work (it could be ultra-short, or it might be on a highly specialized topic that has no female authors in the field - is this common?), but on the whole, it seems like a good rule of thumb for making sure that women authors in one's field are not implicitly overlooked when one cites.
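Purely as an illustration of how the three criteria combine (the test itself is an informal rule of thumb, and the citation-metadata format below is entirely hypothetical, invented for this sketch):

```python
# Toy sketch of the philosophy Bechdel test described above.
# Each citation is a dict with hypothetical keys:
#   'author_is_woman'            - bool
#   'serious_engagement'         - bool (more than a "but see" listing)
#   'cited_for_discussing_a_man' - bool

def passes_philosophy_bechdel(citations):
    """Return True if a paper's citations meet all three criteria."""
    women = [c for c in citations if c["author_is_woman"]]
    return (
        len(women) >= 2                                   # (1) at least two female authors
        and any(c["serious_engagement"] for c in women)   # (2) serious engagement with one
        and any(not c["cited_for_discussing_a_man"]       # (3) at least one not cited merely
                for c in women)                           #     because she discusses a man
    )

paper = [
    {"author_is_woman": True,  "serious_engagement": True,  "cited_for_discussing_a_man": False},
    {"author_is_woman": True,  "serious_engagement": False, "cited_for_discussing_a_man": True},
    {"author_is_woman": False, "serious_engagement": True,  "cited_for_discussing_a_man": False},
]
print(passes_philosophy_bechdel(paper))  # True: two women cited, one engaged with seriously
```

All three conditions must hold jointly, just as in the movie version of the test; a paper citing ten women only in a "but see" footnote would still fail criterion (2).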
In philosophy of religion, realist theism is the dominant outlook: belief in God is similar to belief in other real things (or supposedly real things) like quarks or oxygen. There is a rather triumphalist narrative about the resurgence of realist theism since the demise of logical positivism (see, for instance, Plantinga's advice to Christian philosophers). When logical positivism and its verifiability criterion held sway, philosophers were dissuaded from talking about God in realist terms: religious beliefs were not just false, but meaningless. With the demise of logical positivism, however, theists could again defend realist positions, using a variety of sophisticated arguments.
Nevertheless, the question is whether theists in philosophy of religion are not conceding too much to atheists by talking about theism mainly in terms of beliefs. To ignore practice is to ignore a large part of the religious experience, and what makes it meaningful to the theist. Such an exclusive focus can indeed be alienating, as it seems to suggest that theists believe a whole bunch of ideas that are wildly implausible, e.g., that a man was resurrected from the dead, or born of a virgin. This picture of religious life as believing in a set of strange propositions is, as Kvanvig memorably put it, a view that most theists will not recognize themselves in:
I hardly recognize this picture of religious faith and religious life, except in the sense that one can cease to be surprised or shocked by the neighbor who jumps naked on his trampoline after having seen it for years.
This is not to deny that many theists do believe these things, even in a literal sense, but without looking at the larger picture of practices that help to maintain and instil these beliefs, our epistemology of religion remains woefully incomplete.
It is therefore refreshing to read a recent interview in The Stone with philosopher Howard Wettstein, who, coming from a Jewish background, emphasizes the practice-based aspects of a religious lifestyle. Following Maimonides, he argues that "existence" is the wrong idea for God, and that instead "the real question is one's relation to God, the role God plays in one’s life, the character of one’s spiritual life."
In the recent Mind & Language workshop on cognitive science of religion, Frank Keil presented an intriguing paper entitled "Order, Order Everywhere and Not an Agent to Think: The Cognitive Compulsion to Make the Argument from Design." Keil does not believe the argument from design is inevitable - I've argued elsewhere that while teleological reasoning and creationism are common, arguing for the existence of God on the basis of perceived design is rare; it typically only happens when there are plausible non-theistic worldviews available.
Rather, Keil argues that from a very early age on, humans can recognize order, and that they prefer agents as causes for order. Taken together, this forms the cognitive basis for making the argument from design (AFD). (For similar proposals, see here and here). He proposes two very intriguing puzzles, and I'm wondering what NewApps readers think:
1. Some forms of orderliness give us a sense of design, others do not. What kinds of order give rise to an inference to design, or a designer?
2. Babies already seem to distinguish ordered states from disordered states. How do they do it? What is it they recognize?