How has the academic job market for philosophers changed in the recent past? I noted last year that it looked as though fewer tenure-track or equivalent jobs were being offered year over year from 2013 to 2015. This year's job market has just started, and if we look at the period of August 1st to October 20th in 2013, 2014, 2015, and 2016, 2016 seems no worse off than 2015, but both years have fewer jobs listed in that period than in the previous two. To get a more complete picture, I decided to compare tenure-track jobs listed over the full year, from January 1st to December 31st, to the graduates listed in APDA from 2013 to 2016. Two things are worth noting here. First, 2016 is not yet complete, so there are fewer jobs listed in that year for that reason alone. (I suspect its numbers will end up being similar to those for 2015.) Second, PhilJobs has a great many job listings, but not every job listing, so the number of tenure-track jobs is likely somewhat higher in reality (but I do not have an estimate of how much higher). Keeping those details in mind, this comparison doesn't look too bad at first, with 1,486 graduates to 939 jobs, or a TT placement rate that could be as high as 63% (if all of those jobs went to graduates of that time period, which is very unlikely, as many of these jobs are "open rank" searches).
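For readers who want to check the arithmetic, the 63% figure is just the ratio of listed jobs to listed graduates. A minimal sketch (in Python, using only the counts quoted above):

```python
# Counts quoted above: APDA-listed graduates vs. PhilJobs TT listings, 2013-2016.
graduates = 1486
tt_jobs = 939

# This is only an upper bound on the TT placement rate: it assumes every
# listed job went to a graduate from this period, which is very unlikely
# (many are "open rank" searches).
max_tt_rate = tt_jobs / graduates
print(f"{max_tt_rate:.0%}")  # prints "63%"
```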
APDA has released this year's APA report and has added an application to the website (but we are still working on its auto-update feature, so the data it represents is a few days old as of today). In keeping with our program-specific reports released in April, here are some basic charts on the programs we covered at that time. These are raw numbers for graduates between 2012 and 2016, with only the APDA database numbers reflected in the first two graphs (here and here), whereas the third graph (here) makes use of external graduation data in its "unknown" category (see the note on the 4th graph for details). At the bottom of the page is a sortable chart with percentages for these categories. We have not yet started checking new data (program representatives have added over 400 graduates since August 15th), so there may be some errors (including those noted here). We are currently working on writing up some results from the survey into one or more papers, which should be available sometime in early 2017. Feedback is welcome!
As noted in the APDA update posted over a week ago, we are in the middle of two important projects:
We are adding individual editing to the website in May 2016. Up to March 2016, placement data were edited by project personnel, placement officers, or department chairs. In the future, individual graduates will have the option to claim their entry. To do this, we require a contact email for the graduates in our database. We currently have email addresses for roughly one quarter of the database. For graduates: to ensure that you are included among those who have access to individual editing, please provide your email address here: http://goo.gl/forms/mXUbpeH5ic
Along with individual editing, in May 2016 we will add a brief qualitative survey for graduates. We will use linguistic analysis to compare these responses across graduates, connecting them to metadata on graduating institution, gender, graduation year, area of specialization, and placement type. Participants will be compensated for their time. Again, to do this, we require a contact email for the graduates in our database. For graduates: to ensure that you are sent the qualitative survey, please provide your email address here: http://goo.gl/forms/mXUbpeH5ic
Please feel free to send the form to past philosophy graduates you know who may want to be included! As it says above, time they spend filling out the qualitative survey in May will be compensated (by a $50 Amazon gift card raffle for every 50 participants). And note: it is our policy to treat the email addresses as private and accessible only by project personnel.
This is a brief notice that APDA has finalized its update for the 2015 report. Here is the report from 2015 and here is the update. Please contact me (cjennings3 at ucmerced.edu) with feedback or leave comments and suggestions below.
Update: I replaced one of the links as I noticed that the AOS table had been mismatched to the gender table.
Update (4/15/16): I will list here errors that are discovered in the data/report:
University of Washington--4 grads (2 in 2012, 2 in 2014) should be listed as temporary academic, but are currently listed as permanent (but non-TT) academic.
University of Texas, Austin--placement records are missing several graduates and should be checked against the placement page (the placement page was down when we attempted to check it in November).
University of Arkansas--this program was not contacted for data and should be included in future reports.
As I noted in a previous post, APDA is in the middle of finalizing data for a new report. This will be a follow-up to the report released in August 2015. We hope to include data on graduates with no listed placements and Carnegie Classifications, among other improvements. It is our aim to release the new report by April 15th, so that it can be useful to those who have applied to graduate programs this year. (Until that time, editing on the site has been turned off so that we can verify and analyze the data. We will turn editing back on in May, when we enable a new feature that allows for individual editing.)
In preparation for that report, I have been trying to determine the best way of displaying our data. I am attaching four DRAFT images that present data for 104 universities using pie charts (on gender, AOS, job type, and graduation year: gender and AOS use data from APDA alone, whereas job type and graduation year also use graduation information from outside APDA, discussed in this post). I used pie charts because they are visually intuitive and I want the data to be as accessible as possible. I used suggestions from this post to help avoid some common criticisms of pie charts. (Note: I tend to analyze data in R, using ggplot2 for graphs; I mention this for anyone with expertise in this area.) At the top left of each image are the data for the full set of 104 universities. (Universities are included only if we have both an external source of graduation data and placement records for that university with recorded graduation years in this time period.)
I am looking for feedback on these charts. Are these easy to understand? Are there alterations that would be beneficial? Two other options, with images below: 1) Replace pie charts with bar graphs (one sample version below). 2) Make university-specific sets of charts. (This is more time-intensive than 1.)
Note also: We aim to release tables and regression analyses, as we did last time, and any images we release will be in addition to that work. Your input is welcome!
The Academic Placement Data and Analysis project (APDA) hopes to release program specific placement rates in the next week or two (before April 15th). These placement rates compare placement data to graduation data, so good graduation data are crucial. Yet, finding consistent graduation data is surprisingly tricky. The project currently uses the following external sources:
We gather data from multiple sources because each data set is incomplete, and for different reasons. For instance, the Survey of Earned Doctorates gathers data from programs in the United States alone, while the American Philosophical Association collects data from programs in the United States and Canada. Since the Review of Metaphysics publication supplied names, we were able to integrate this information into APDA. For the other three sources we compiled the number of graduates for the years 2012-2015 into a single spreadsheet, assuming the later of the two years when a range was provided (e.g. 2011-2012). If I remove the programs that had missing data from all three remaining sources (SED, PhilJobs, APA), then we have data on 105 universities. How do these sources compare to one another and to the data contained in APDA?
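For anyone curious about the "assume the later year" convention, here is a minimal sketch of that normalization step (the function name is mine, not part of the project's actual code):

```python
def reported_year(raw):
    """Normalize a reported graduation year, taking the later year when a
    range such as '2011-2012' is given (the convention described above)."""
    return int(str(raw).split("-")[-1])

print(reported_year("2011-2012"))  # prints 2012
print(reported_year(2014))         # prints 2014
```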
Due to the suggestion of Lionel McPherson in comments at this post, I am disaggregating the non-white category of this previous post into three lists: "Hispanic," "Asian or Pacific Islander," and "Black" graduates of PhD programs in philosophy, per graduating institution. Importantly, the data only cover permanent residents and citizens of the United States (thanks to Brian Weatherson for pointing this out). Because of that fact I use data from the United States census as a point of comparison above each list.
Note that the data on graduates were provided by the National Center for Science and Engineering Statistics thanks to Eric Schwitzgebel's efforts (see here and here). Specifically, the NCSES supplied the number of racial and ethnic minority graduates from doctoral philosophy programs in the United States between the years 1973 and 2014 (but not broken down by year).
Below, I list the programs in the United States with a higher than average (mean) percentage of graduates from each of these categories, where the mean is taken for 96 programs in the United States (I omitted institutions from the NCSES data that no longer offer doctoral degrees in philosophy)...
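The "higher than average (mean)" cutoff works as follows. A sketch with made-up percentages (not the NCSES data):

```python
# Hypothetical per-program percentages for one category (e.g. Black graduates).
pct = {"Program A": 8.0, "Program B": 2.0, "Program C": 5.0, "Program D": 1.0}

# Mean is taken across all listed programs; a program is reported only if it
# exceeds that mean.
mean_pct = sum(pct.values()) / len(pct)
above_mean = sorted(name for name, v in pct.items() if v > mean_pct)
print(above_mean)  # prints ['Program A', 'Program C']
```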
Most of us know about efforts to sort philosophy programs according to placement rate or prestige, but what of the percentage of PhD graduates from each program who are women or other underrepresented minorities? Thanks to Eric Schwitzgebel's efforts in contacting the National Center for Science and Engineering Statistics (see here and here), we have access to some numbers on this issue. Specifically, the NCSES supplied the number of women and minority graduates from doctoral philosophy programs in the United States between the years 1973 and 2014 (but not broken down by year). Below, I provide the top programs in the United States from this list of 96 programs in terms of % of women graduates in this period, as well as the top programs in terms of % of non-white graduates, where for "non-white" I am aggregating the NCSES categories of "Hispanic," "Asian," "Asian or Pacific Islander," "Black" and "two or more races." (I omitted institutions from the NCSES data that no longer offer doctoral degrees in philosophy.) One striking feature of these lists is how many of the programs show up on Brian Leiter's list of PhD programs "whose existence is not easy to explain." A provocative rhetorical question follows: Should we be closing PhD programs that better serve women and minorities in philosophy? I welcome discussion below.
I recently joined Twitter and uploaded some quick attempts to sum up what has been happening with job ads on PhilJobs this year compared to a couple of past years. I noticed, first, that there are fewer job ads this year so far than in previous years, at least on PhilJobs (with some nice caveats provided in comments here). Second, looking at first AOS, the most sought-after area of specialization this year differs from previous years. While in my initial tweet on this I said that value theory appeared better off than other areas of specialization this year, that was based on a mistake. (You can check out the Excel file I used for 2 and 3 if you want to help me identify other potential mistakes. 1 is based on PhilJobs searches, not a csv file.) In terms of percentages, all areas of specialization are down this year since open searches are up, relative to last year. I take this increase in open searches to be a good thing, in terms of potentially increasing the intellectual diversity of philosophy, but I am interested in what others think about this. Third, if you look at the full AOS listing for job ads, certain words are more frequent this year than you might expect, given the first AOS listing, such as "science." Finally, if you look at the first-reported AOS of the bulk of the placed candidates in the APDA database, the AOS balance is different yet again (favoring LEMM over history and traditions, for instance). (In the future, I can break this down by TT placement year, but I didn't have time to do that for this post.) These are initial numbers, and the season just started, but I think this is a space worth watching. Here are some numbers and images (with 2015 highlighted in yellow):
In the coming weeks I hope to be updating you with more details and analyses, but for now I am simply announcing that the final report for APDA is complete. Feel free to ask questions or comment below.
*Update: we noticed an error in one of the charts and some potentially confusing language in that section, so we have updated the report at the link.
A few weeks ago I posted some details about a new project: Academic Placement Data and Analysis (APDA) here. Readers may be interested in some updates to that project. Note: We are sending out emails to program representatives over the next few hours with much of this information, including an extended collection goal date of July 22nd, 2015. The original blogpost is quoted below.
1) Total Placement Records
"There are approximately 2300 total entries, with several categories of data."
As of noon on July 13th, we had 3078 placement records for 2444 people--that is 573 more placed candidates than we had in the database on June 23rd. (In comparison, PhilJobs, the next most comprehensive database, had 2307 placement records for junior hires at that same time.)
Academic Placement Data and Analysis (APDA) is a new, collaborative research project on placement into academic jobs in philosophy. The current project members include myself, Patrice Cobb (psychology, UC Merced), Angelo Kyrilov (computer science, UC Merced), David Vinson (cognitive science, UC Merced), and Justin Vlasits (philosophy, UC Berkeley). This project grew out of earlier work on placement that was posted here and elsewhere over the past few years. Funding for this project by the American Philosophical Association has so far provided for the development of a website and database that can host the data for this project (thanks to the work of Angelo Kyrilov over the past two months). There are approximately 2300 total entries, with several categories of data. Most of these categories of data have been made publicly available, whereas any categories that have not been made public (e.g. name, gender, race/ethnicity) will be provided to researchers with IRB approval from their home institutions. You can see the website and database so far here:
When I first looked at placement statistics at the Philosophy Smoker I performed some analyses that I shouldn't have. First, I performed too many analyses. Second, I used the wrong kinds of analyses for some of the data. I did not imagine that these statistics would take off as they did and I was overworked*, which contributed to some mistakes on my part. One of these mistakes was running correlation analyses over gender:
I also found a negative correlation between PhD granting institution and number of publications (-.17: the lower your PhD granting institution is ranked the more peer-reviewed publications you have) and between gender and number of publications (-.21: if you are a man you likely have more publications than if you are a woman).
While at the time I suspected that this negative correlation had to do with the increased difficulty women have in publishing their work, others worried that women had an upper hand on the job market. I brushed off this latter worry because the proportion of women who found tenure-track jobs was about the same as the proportion of women who obtain PhDs in philosophy. In fact, in the 2011-2014 data set I found that there is not a significant difference between the proportion of women who graduate from each department and the proportion that find tenure-track jobs from each department (but there is a significant difference for postdoctoral/VAP/instructor positions, which are awarded to a smaller proportion of women relative to women graduates). But this worry regularly comes up in comments and I feel a responsibility for having possibly led people astray with analyses I shouldn't have used in the first place. For that reason, I want to provide some more appropriate analyses here, as clarification on the relationship between gender and publications in the placement data from 2011-2012 and 2012-2013. Those who want to check this work can use the spreadsheet at the bottom of the post here, which is the one I used. (I do not use the more recent data because I decided not to collect publication data in this last round, due to time constraints.)
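The kind of test I have in mind for comparing proportions, such as the proportion of women among graduates versus among tenure-track placements, can be sketched as a two-proportion z-test. The sketch below uses only the standard library, and the counts in the example are illustrative, not the actual data:

```python
from math import erf, sqrt

def two_proportion_z(count1, n1, count2, n2):
    """Two-sided z-test for a difference between two proportions, e.g. women
    among graduates (count1/n1) vs. women among TT placements (count2/n2)."""
    p1, p2 = count1 / n1, count2 / n2
    pooled = (count1 + count2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # Two-sided p-value from the standard normal approximation.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative (made-up) counts: 60 women among 200 graduates vs. 25 women
# among 100 tenure-track hires.
z, p = two_proportion_z(60, 200, 25, 100)
```

A non-significant p-value here is what "not a significant difference between the proportions" means above; the chi-square test on a 2x2 table gives an equivalent answer.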
I have read in several places this description of my placement post and my response to Brian Leiter's criticisms of that post (most recently, in comments posted yesterday at Philosophical Comment):
"July 1: I posted a sharp critique of some utterly misleading rankings produced by Carolyn Jennings, a tenure-stream faculty member at UC Merced. She quickly started revising it after I called her out."
For the record, this does not strike me as an accurate representation of those events.
First, while I did post a ranking, I made it clear that I did this as an exercise: (from the original post, bold original) "As discussed here in the comments, one of the advantages of comparative data on placement is that they help fill in gaps left over by the PGR...To illustrate this, I below rank the top 50 departments by tenure-track placement rate**, providing for comparison these departments' ranks from the 2011 "Ranking Of Top 50 Faculties In The English-Speaking World" by the Philosophical Gourmet Report. Please note that this placement ranking is provided only to demonstrate the potential utility of these data."
Second, while Brian Leiter did find the rankings misleading, many others did not, and even commended the clarity of language in my post. Take these quotes from David Marshall Miller, who has also worked on placement data: "Andrew Carson and, especially, Carolyn Dicey Jennings have developed analyses that now strike me as very robust." and "I will say, to again quote Leiter, that “all such exercises are of very limited value.” Nevertheless, they are of some use, and should be made available, so long as the methodology and limitations of the analysis are made clear. I think the PGR and the placement rankings by Jennings, Carson, and myself all meet this standard."
Third, Brian did post criticisms of the ranking, but I did not make any substantial revisions to the ranking based on his criticisms, since I did not find those criticisms to have merit. Brian's way of characterizing my response at the time was "Prof. Jennings digs in her heels."
Over the past three years I have collected and reported on placement data for positions in academic philosophy. (Interested readers can find past posts here at New APPS under the "placement data" category, two of which have been updated with the new data, several posts at ProPhilosophy, or the very first post on placement at the Philosophy Smoker.) This year, placement data will be gathered, organized, and reported on by the following committee of volunteers (listed in alphabetical order):
Over the next academic year, we aim to create a website, which will be parked at placementdata.com. This website will include a form for gathering data, a searchable database, and reports on placement data. Until that time, I am suspending updates to the Excel spreadsheet, which contains much of the data used in the past few years, plus the updates I have received over the past few months. (Many thanks to Justin Lillge for incorporating the bulk of these updates into the spreadsheet!) When the website is ready, departments will be able to update their placement data through an embeddable form. Stay tuned for these links in the coming months!
Marcus Arvan, of The Philosophers' Cocoon, had the idea of running a graduate student survey. This was something that the five of us had already talked about (and Justin Lillge had done some preliminary work on this), so we have invited Marcus to join us in this project. He has posted some initial ideas here. Please do contribute to the discussion if you have insight!
A few days ago I posted a list of features that I take to be essential to an ideal report on placement, seeking comments and suggestions. One of the features I mention there is recency. All departments are likely to place more candidates given more time, but this slope is steeper for certain departments. Moreover, placement varies year to year. Thus, one's choice of time frame can substantially alter data on placement. This is the reason that Brian Leiter's numbers for NYU look better than mine (here and here)--I looked at the years 2012 to 2014 (3 years in the recent past), whereas he looked at the years 2005 to 2010 (6 years in the distant past).* Looking at NYU's placement page, one can easily see that the percentage of graduates placed in tenure-track jobs drops as one reaches the present. As I said, this is likely true for all departments. This means that if you look at data in the distant past, it might not matter what the length of the time frame is, but if you look at data ending in the recent past, the length of time frame makes an impact. That is, for NYU for the years starting in 2005, a 6-year time frame has 87% TT placement, a 5-year time frame has 90% TT placement, a 4-year time frame has 88% TT placement, and a 3-year time frame has 90% TT placement. But for the years ending in 2013, a 6-year time frame has 69% TT placement, a 5-year time frame has 65% TT placement, a 4-year time frame has 56% TT placement, and a 3-year time frame has 56% TT placement. Note that even the 6-year window ending in 2013 is associated with much lower placement than any of the windows starting in 2005. It seems obvious to me that we should favor more recent data, since they reveal which departments place students more quickly than others and since they are more relevant to students looking at graduate programs. Beyond that, it is not obvious just what length of time we should choose (3, 4, 5, or 6 years) or just which year we should use as the endpoint.
Yet, one's choice of time frame has a large impact on comparative placement data. Let's compare NYU's placement page to the placement pages of those departments that I found with these methods to have the highest tenure-track placement rates: Berkeley, Princeton, Pittsburgh HPS, and UCLA. If we look at NYU's worst time frame it comes out behind all the others (2010-2013: NYU 56%, UCLA 59%, Berkeley 63%, Princeton 65%, and Pittsburgh HPS 88%). If we look at NYU's best time frame it comes out ahead of all the others (2006-2009: NYU 94%, UCLA 67%, Berkeley 78%, Princeton 86%, and Pittsburgh HPS 93%). If, on the other hand, we look at multiple time frames then a new type of comparison is possible. We can determine, for example, which department has the least low value for tenure-track placement, given any time frame in the period from 2005 to 2013 (with a 3-year minimum time frame and a 6-year maximum time frame). In that case, Pittsburgh HPS comes out on top. Its lowest value is 85%. In comparison, the lowest value for Princeton is 65% (2010-2013), the lowest value for Berkeley is 59% (2009-2012), the lowest value for UCLA is 52% (2009-2012), and the lowest value for NYU is 56% (2010-2013). So if we look at the least low placement for all of these time frames, NYU comes out second to last. Finally, if we look at the full range, from 2005 to 2013, NYU comes out in the middle (Pittsburgh HPS 93%, Princeton 76%, NYU 74%, Berkeley 70%, UCLA 65%).
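For anyone who wants to reproduce this kind of window comparison, here is a sketch. The function name and the toy numbers are mine; real inputs would be per-year placement and graduation counts taken from a department's placement page:

```python
def window_rates(placed, grads, min_len=3, max_len=6):
    """TT placement rate for every window of min_len..max_len consecutive
    years, returned as {(start_year, end_year): rate}."""
    years = sorted(grads)
    rates = {}
    for i, start in enumerate(years):
        for j in range(i + min_len - 1, min(i + max_len, len(years))):
            span = years[i:j + 1]
            g = sum(grads.get(y, 0) for y in span)
            if g:
                rates[(start, years[j])] = sum(placed.get(y, 0) for y in span) / g
    return rates

# Toy example: a department whose most recent cohort has placed less often so far.
rates = window_rates({2005: 5, 2006: 5, 2007: 1}, {2005: 5, 2006: 5, 2007: 5},
                     min_len=2, max_len=3)
# The "least low value" comparison is then just the worst window:
worst = min(rates.values())
```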
Suffice it to say, these decisions make a substantial impact on one's results. For that reason, one should attend carefully to justifications on recency and time frame. I will remove the links to Brian Leiter's two posts on placement data here, since I am concerned that they will mislead students. If I had written those posts, I would certainly take them down knowing what I have made clear in this post (i.e. that the numbers for NYU are inflated for the very time frame that Brian Leiter chose to look at, relative to other departments). I have emailed Brian a link to this post.
As for my data, I use the years 2012 to 2014 because those are the most recent years and the years for which I have large data sets. (ProPhilosophy was kind enough to email departments directly in 2012 and 2013, which substantially increased the number of reported hires for those two years.) To go prior to 2012 I would have to either look at individual placement pages for all 118 departments, many of which do not have data of the sort I need, or use what I know to be a skewed sample from the Leiter Reports blog. I have made clear that any rankings I produce are a work in progress and should not be taken as authoritative. (That is one reason I post them to blogs, and not an independent website.) But as time goes on and this process is improved I will have to start making decisions about which time frames matter. I may well follow the lead of David Marshall Miller in reporting multiple time frames, since this might be helpful for students. Suggestions on this point are welcome. (The data that I used for this post are after the break. Feel free to suggest corrections where needed.)
*I hope that this does not need saying, but I am not picking on NYU here. One of my dissertation advisors was at NYU and one of my best friends is currently a student there. I am looking at NYU because it appears to be a focal point in Brian Leiter's criticism of my work. If one were to look at other measures beyond just tenure-track placement, NYU may well fare better than it does here.
Update (7/14/14): In order to satisfy the worry that NYU is particularly burdened by graduates of the JD/PhD program in this measure (2 graduates from NYU left academia for law in this time period, compared to 1 from Princeton, 3 from Berkeley, and perhaps 2 from UCLA), I compared NYU to these other programs while leaving out all those graduates who left academia. In that case, as I point out in the comment below, it is still clear that time frame matters and, in particular, that the time frame of 2005-2010 overly inflates NYU's record (2008-2013 puts NYU in the middle of the group, at 80%, whereas 2005-2010 puts it at 95%, square with Berkeley and Pittsburgh HPS, ahead of UCLA and Princeton. It might be worth noting that with the same methods Fordham University placed 69% of its graduates into tenure-track jobs between 2008 and 2013). See my comment below for details.
I applaud Brian Leiter's efforts to examine placement data in the past few days *Update 6/13/14: I have removed these links because I think that Brian Leiter's posts have the potential to mislead students. See my new post here*, as well as the efforts of David Marshall Miller and Andy Carson over the past few years. All of this is effort to improve the profession and deserves recognition as such. I plan to continue reporting placement data next year and will likely post the report to an independent website. Below is a list of features that I take to be essential to an ideal report on placement, together with some ideas for improvement on my own work. Please comment below!
1) the original data: as far as I know this is missing from both Brian Leiter and Andy Carson's efforts. This is important because it keeps the analyses honest by opening them up to public scrutiny. I have provided links to my data and will continue to do so. Recommendations on format are welcome here.
2) the methods: key information is missing in Brian Leiter's presentation, such as the criteria for determining which placements are to "research universities and selective liberal arts colleges," but as far as I can tell David Marshall Miller and Andy Carson are clear and up front about their methods. I have tried to be clear about my methods, but I have received some emails that reveal shortcomings here. Recommendations welcome.
3) completeness: Brian Leiter's efforts, as of this moment, include only a few departments (that were not selected at random). An ideal report should include all the philosophy departments that have made placements of the type in question, which is something David Marshall Miller, Andy Carson, and I have all tried to do. What is missing from all of our reports is complete placement data. PhilAppointments is not a complete source, for example, but neither are placement pages. Further, placement pages are often missing key data points on placement (such as names, which help to identify duplicate candidates). Next year I aim to cross-reference PhilAppointments with individual placement pages. Recommendations on how to efficiently improve completeness are welcome.
4) recency: since these efforts are in their infancy, it is currently unknown what time frames are relevant. Recent data are ideal, so long as recency is balanced with completeness. Brian Leiter chose a 5-year time frame between 2005 and 2010, which I see as a drawback of his report. Although David Marshall Miller, Andy Carson, and I have all used the most up-to-date data, David Marshall Miller also looked at different time frames. In the future, with more data, the use of time frames should help us to determine how recent our data needs to be. Recommendations on how to proceed with time frames are welcome here, since next year the data set I have will be in its fourth year (2011-2015).
5) neutrality: Those collecting, analyzing, and reporting the data should be as neutral as possible with respect to hypotheses and results. I have concerns about this with respect to Brian Leiter's report, especially given the absence of 1 and 2. The fact that David Marshall Miller, Andy Carson, and I have performed this work on our own is also potentially problematic, even with the inclusion of the original data and methods. Over the next year I plan to form a task force to work on placement data, composed of several people who have reached out to me over the past week or so (but others are welcome). Having more people on the project should help with neutrality. Recommendations on this point are welcome.
As promised, here is the link to the data set I have been using in the placement posts. Most of you will probably be most interested in the "Department Trends" tab. If you find that data should be added, please email me with the following information, preferably in order and separated by commas OR add the relevant information to PhilAppointments, which I will use to update this data set from time to time:
1) Year of placement
2) Name of placed candidate
3) PhD-granting institution of placed candidate (and department, where relevant)
4) Type of placement and name of hiring institution
As discussed here in the comments, one of the advantages of comparative data on placement is that they help fill in gaps left over by the PGR. That is, the PGR aims to measure the collective reputation of a department's faculty, but faculty reputation does not necessarily predict the likelihood of placement by that department, perhaps because it does not necessarily predict the overall quality of education in that department nor the quality of preparation for the job market by that department. Comparative data on placement have the potential to provide insight on these factors. To illustrate this, below I group the top 50 departments into brackets by tenure-track placement rate** (Note: I removed three universities from the top 50 that reported fewer than 2 graduates per year, since small numbers may yield misleading placement rates), providing for comparison these departments' ranks from the 2011 "Ranking Of Top 50 Faculties In The English-Speaking World" by the Philosophical Gourmet Report. Please note that placement brackets are provided only to demonstrate the potential utility of these data. Since the data set is not yet complete, I do not recommend viewing these as authoritative brackets. Update: Please see this post for an idea of how I envision this project developing. I have released the spreadsheet containing the raw data and methods I have been using to compute these results, and welcome any/all corrections. As a reminder, I do not have data on the yearly graduates from many departments, listed below. (Those departments are welcome to send me their data, if available.)
Update 7/1/2014: It has come to my attention that Brian Leiter has aired some criticisms of this post on his blog and has publicly suggested that it (this post, not his blog) be taken down. I respond to these criticisms below.
I changed some wording above from "ranking" to "brackets" and added a link to the spreadsheet. I have also changed the numbers in the ranking below to a grouping by bracket (departments are listed in alphabetical order within brackets). This was a suggestion of Ned Block's: we have been corresponding about statistical significance, and I decided that his suggestion would help avoid making small differences between placement rates appear more important than they are. I have left in the PGR rank for comparison, although the difference in rank has been omitted for the reasons provided above.
I have also added updates to my responses to Brian, based on some new statistical tests.
I am adding a link to a chart that will help readers to visualize the total number of reported tenure-track placements and estimated graduates from each department, rather than just percentage of tenure-track placements.
Update 7/6/2014: I ran a completeness test on 5 departments selected at random using a random number generator. The tenure-track numbers for these 5 departments appear to be accurate. More below.
As discussed in the comments at a previous post, I have been looking at department-specific placement rates. "Placement rate" is the number of reported placements*** divided by the number of graduates. I looked at reported placements between 2011 and 2014 and graduates between 2009 and 2013. I do not have data on many departments that reported placements in this time frame**, but of those 94 departments for which I do have data, 32 appear to have placement rates higher than 50% for tenure-track jobs and 51 appear to have placement rates higher than 50% for a combination of tenure-track, postdoctoral, VAP, and instructor jobs (both sets are listed below).****
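The placement rate defined above is a simple ratio; a minimal sketch of the calculation, using invented numbers rather than any actual department's data:

```python
def placement_rate(reported_placements, graduates):
    """Placement rate: reported placements divided by graduates, as a percentage."""
    if graduates <= 0:
        raise ValueError("graduates must be positive")
    return 100 * reported_placements / graduates

# Illustrative numbers only (not actual department data):
# a department reporting 12 tenure-track placements out of 20 graduates.
rate = placement_rate(12, 20)
print(f"{rate:.0f}%")  # prints "60%"
```

Note that because placements are counted over 2011-2014 while graduates are counted over 2009-2013, the ratio is an approximation rather than a cohort-matched rate, which is one reason small departments can show misleading values.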
Update: I have removed the following departments from both lists because I do not have updated graduation data from them: University of Chicago, University of Pennsylvania, and Yale University. These departments may well have placement rates as high as these others, but the graduation data I have from them comes from the 2012 APA Graduate Guide, since they did not complete the 2013 APA Graduate Guide. If the department chairs respond to my email of June 10th with updated information, I will update their status.
In two previous posts I have provided data on gender and AOS for placements reported at ProPhilosophy (2011-2012 and 2012-2013) and PhilAppointments (2013-2014). As of today, I have data on 729 placed candidates. In this post I aim to use this and other data to estimate the total number of candidates seeking employment and to calculate an approximate overall placement rate.
In keeping with the earlier post on gender, this is an overview post on the distribution of (first-listed) areas of specialization among placed candidates. I now have data on 722 candidates who have been placed in tenure-track, postdoctoral, VAP, or instructor positions between late 2011 and mid 2014 (ending today), drawn from ProPhilosophy (2011-2012 and 2012-2013) and PhilAppointments (2013-2014). I aim to make the spreadsheet with this data available by around July 1st (I will continue to add new data until that date).
I have data on 715 candidates who have been placed in tenure-track, postdoctoral, VAP, or instructor positions between late 2011 and mid 2014 (ending today), drawn from ProPhilosophy (2011-2012 and 2012-2013) and PhilAppointments (2013-2014). I aim to make the spreadsheet with this data available by around July 1st (I will add any new data available by that date). Until then, I will report some initial findings, starting with gender.
Yesterday I posted data for tenure-track placement from this past year. The data below include postdoctoral, VAP, and instructor hires sourced from PhilAppointments. Please check the data and make corrections in comments or by email (cjennings3 at ucmerced dot edu).
Last year I posted some statistics on tenure-track, postdoctoral, and VAP placements between 2011 and 2013. I aim to continue these analyses for a third year. Along the way, I will post progress on data collection, in case corrections are in order. The data below include tenure-track or equivalent hires sourced from PhilAppointments (I will provide a new post with postdoctoral and VAP data soon). Please check the data and make corrections in comments or by email (cjennings3 at ucmerced dot edu).
Marcus Arvan at the Philosophers' Cocoon posted sample data from the new appointments site at PhilJobs, which is discussed in a great post by Helen de Cruz here at New APPS. In comments at de Cruz's post and in a new post, Arvan discusses the impact of Gourmet ranking on women and men seeking tenure-track jobs. I wanted to follow up on Arvan's post by looking at the full set of data currently available at PhilJobs. I did this in part because I knew that the sample Arvan collected was skewed on gender, due to an earlier analysis on gender I performed for a comment on a post at the Philosophy Smoker. With that convoluted introduction aside, here is a summary of the findings, in keeping with the findings by Arvan: the Gourmet rank of one's PhD-granting institution appears to have a greater impact on men seeking tenure-track jobs than on women seeking tenure-track jobs. Although I cannot yet speak to the source of this discrepancy, I (like Arvan) find the difference troubling. I welcome comments on the source of the difference below, although any comments will be subject to moderation. Let's look more closely at the data (Note: the linked spreadsheet was updated on May 14th):
The much anticipated appointments page at PhilJobs is now live (see this announcement from the APA). To encourage the use of this service, we will be suspending the hiring thread on NewAPPS. I want to commend this effort by the APA, David Bourget, and David Chalmers, which will certainly be a helpful addition to the profession.
This is just to note that the links for reporting tenure-track, postdoctoral, and VAP hires from 2013-2014 have been placed in the upper-right sidebar of this blog. This should facilitate the reporting and monitoring of this information. Further, both Daily Nous and ProPhilosophy plan to integrate the information into their sites in an easier-to-view format. Thank you to all of the commenters at the original posting and to all those who have already stepped up to help with this effort.
If you would like to report hiring information from 2013-2014, please fill in the form at this link; the data entered there feeds into a spreadsheet available here. Quite a bit of hiring information is already available at Leiter Reports, here.
UPDATE 8 March 10:30 am CDT: This form and spreadsheet need not be limited to this NewAPPS post. If any other blog would like to link to it, they are welcome. In that case, I would be happy to make the relevant bloggers co-owners of the Google documents in question. Ideally, the information would be available in a neutral location, but having the links posted to several different blogs would come close to that.
By clicking the link posted below you can download an Excel spreadsheet with placement data from the last two years, 2011-2012 and 2012-2013, together with a “how-to” guide for future years of data gathering and analysis. The data is sourced from ProPhilosophy, the link for which you can find here.