As is well known, philosophy is a very male-dominated (and white, straight, etc.) field compared to the other humanities, the social sciences, and even several STEM disciplines. Even if we take into account the difficulties that minorities face in academia generally, we cannot explain why philosophy does worse than most other academic fields. I'd like to put a slightly controversial idea on the table: there are good reasons to believe that philosophers are less effective than academics in other fields at countering their own biases, i.e., they exhibit a larger bias blind spot.
The bias blind spot was first experimentally demonstrated in this study. As is well known, people tend to exhibit the better-than-average bias: they think they are better drivers, more intelligent, more sensitive, etc. than the average person. It turns out that people who exhibit this bias persist in it even after reading a description of the bias in question and how it could affect them. Ironically, participants not only failed to revise their judgments, but believed themselves to be better than their peers at evaluating their own merits, thinking themselves, but not others, free of bias.
Recently, a study found that the bias blind spot does not attenuate with cognitive ability. The authors found that more cognitively sophisticated subjects in fact had larger bias blind spots. Moreover, they found "no evidence whatsoever for the notion that people who are more aware of their own biases are better able to overcome them". So it is not so surprising that the anonymous contributor to this blog post found his or her judgment to be affected by a sexist letter of recommendation. Philosophers are cognitively sophisticated individuals; pursuing a philosophy major, for example, has benefits for one's analytic and verbal skills. If West et al. are correct, this increased cognitive sophistication may make us especially vulnerable to the bias blind spot, and hence make it harder, not easier, for philosophers to overcome their biases.
More perniciously, consciously reflecting on one's own biases, and being given the opportunity to show how unbiased one is, can even increase bias. This review paper shows that people who first demonstrate or explicitly voice their lack of bias are subsequently more biased than those who do not have this opportunity. I quote a lengthy passage that is quite remarkable:
Monin and Miller (2001) let some participants demonstrate their lack of prejudice before presenting them with this police-force scenario by asking them to play the role of an employer choosing which candidate to hire for an unrelated consulting job. The best-qualified candidate happened to be African-American and the other four were White. Nearly everyone selected the African-American candidate, a choice that presumably made them feel that they had established themselves as nonracists as they went into the second part of the experiment. In the control group, all five candidates were White, so control participants did not get a chance to demonstrate a lack of prejudice. As predicted, participants who had been able to demonstrate their nonprejudiced attitudes in the first hiring decision said that the police job was better suited for a White person than people in the control condition. Analogous results were obtained in the domain of sexism: the opportunity to disagree with blatantly sexist statements or to pick a woman for a consulting job made participants more likely to describe a stereotypically masculine job as better suited for men than for women. It thus appears that the opportunity to obtain a moral license freed participants from the anxiety that goes along with making morally ambiguous decisions.
Given that philosophers often reflect on things, even when not engaged in philosophy, it seems plausible that they will engage in the sort of moral self-licensing discussed here. To give a few anecdotes: I have heard a fellow philosopher who failed to find a female keynote for a conference reason as follows: "Well, I made an effort. I just invited two female keynotes and they both declined. I've done all I could to make this a conference with gender balance. In fact, I've already gone a lot further than most other people have."
More disturbingly, the following was written in a note to members of an almost all-male philosophy faculty (I'm paraphrasing):
Over the past searches, we have made an effort to shortlist more women. Nevertheless, almost all of our recent hires were men. Our biggest problem is that very few women apply for our jobs. We cannot afford to lower the bar for women and engage in affirmative action. Rather, we will be more proactive in our recruitment of female candidates in future searches, and hope in this way to attract female academics who can serve as role models for our female students.
(Alas, the authors of the memo took no steps in this direction, and the hires since the note was circulated have all been male.)
So what can we do? Apparently, being aware of one's biases doesn't decrease them, and can even result in a form of moral self-licensing. Thinking that you know all about biases in grading, but continuing to grade non-anonymously because, after all, you know about them, is not going to improve things. Given the importance of extended cognition in overcoming our cognitive limitations, the most effective way to counter these biases is to structure our environment so that they have less room to operate. Anonymous grading, active mentoring programs (not just the informal mentoring that takes place in bars and at sporting events over the weekend), and deliberate efforts to increase the presence of women at conferences and in edited volumes are very effective, precisely because they do not rely on our capacity to detect our own biases.