So you want to be a history professor

NB: In India nowadays there’s a separate problem. You need to be in agreement with the world view of the ruling dispensation – whether or not you know any history. DS

David A. Bell

A historian friend once told me that when he went on the job market, he put in three applications and received five job offers. That was in the early 1960s, during the heady years of post-war economic expansion and university expansion. Ten years later, both expansions abruptly ceased, and the academic job market crashed. It recovered somewhat after the mid-1980s, although with frequent downturns. In the past few years, it has crashed again to new lows.

Jon K. Lauck, a historian and editor in chief of the academic journal Middle West Review, recently surveyed the sorry state of the field. Of the 1,799 new historians who received Ph.D.s in 2019 or 2020, only 175 had landed full-time faculty jobs in history as of last fall — and it is not clear how many of those are tenure track. The number of undergraduate majors in history has plummeted. Lauck traces departments that are being hollowed out: The University of Kansas history faculty is down from 35 members in 2017 to 24; the Ohio State University system’s history faculty has fallen from 79 members to 62 since 2008; Iowa State University’s history department has been told by administrators that its faculty must shrink from 20 members to 8. All of this has consequences, as Lauck details:

These days, some of the conferences I used to attend and greatly enjoyed have been canceled entirely. History-journal editors also whisper about what they are seeing. Article submissions used to stream in at a steady clip. Now the pipeline is but a trickle. Prominent history professors, who once anchored departments and enlivened the public sphere in and around college towns, now retire with little fanfare and nary a replacement. Their “line,” if it survives at all, is moved across campus, to computer science or physical therapy.

In my area, French history, the numbers tell the same story in miniature. Since 2010 I have been tracking the number of North American tenure-track jobs my advisees can reasonably apply for each year — searching the postings for everything from “history” to “Europe” and tracking down specialized jobs in modern and early-modern France. In 2010-11, there were 43 available tenure-track positions; after dipping through much of the 2010s, the number returned to 42 in 2017-18. But the next year it crashed to 18 positions, and during the pandemic year of 2020-21, it fell further, to just 8. This year, so far, there are 9 available positions (the figure counts only full-time tenure-track jobs at U.S. and Canadian four-year colleges).

The pattern also tracks with my experience as an adviser. Of my 10 Ph.D. students who defended their dissertations before 2016, all but one got a tenure-track job (and the one who didn’t limited the job search to a single metropolitan area for personal reasons). Of the eight who have defended since then, only one has so far gotten a tenure-track job. Five of these eight have landed very competitive postdocs, so the problem is clearly not with the students. But will jobs be there when the fellowships end? Will jobs come back to their former level? We can hope, but I don’t know anyone who would bet on it at present. I know the situation in history the best, but similar, and perhaps worse, trends are playing out in most of the other humanities and soft social sciences.

It is now graduate-admissions season, and given these woeful numbers, the question is more pressing than ever: Is it practical — or ethical — to admit new advisees? Despite the bleak numbers, there is an argument to be made — and several of my colleagues make it quite forcefully — that it does make sense, at least at Princeton University, where I teach, and places like it. Princeton is wealthy, and it provides graduate students with a reasonable income: close to $50,000 a year, plus some subsidized housing and extra funds for research and conference travel and language study. Princeton graduate students don’t even have to work as teaching assistants to receive this stipend, which lasts for five years with the possibility of further extensions. During this period, they can read and study to their hearts’ content, learn languages, travel, receive guidance from brilliant colleagues of mine like Anthony Grafton and Linda Colley, and produce advanced scholarship of their own. There is a serious argument to be made that this is a valuable and fulfilling way to spend several years of one’s life, even if it doesn’t lead to a job in the professoriate. Every year when I speak to prospective graduate students, I emphasize the state of the job market and tell them that these intellectual satisfactions may be the principal thing they gain from graduate school. They often say this is a deal they are ready to accept.

The problem is that the trade-offs are not always clearly visible to these prospective students. They generally enter a Ph.D. program at around age 24 and take at least six years to get the Ph.D. Afterward, it is not uncommon for them to spend at least three or four more years in postdocs and non-tenure-track appointments before deciding, if a tenure-track job has not come through, to pursue opportunities in a different field (assuming they do not decide to struggle on as adjuncts, earning a miserable wage without benefits). At that point, they are in their early to mid-30s and often find themselves competing for entry-level jobs in other fields with more-recent college graduates. When they start these jobs, they will be behind their peers in salary and retirement benefits, not to mention the sort of professional security and stability that makes it easier to start a family.

So what is to be done? I can’t accept that we should tolerate the situation as it is. One solution often put forth is to cut the size of our graduate cohorts dramatically. Quite a few universities have already done this, in history as well as in other disciplines. There are two problems here, however. First, many public universities cannot cut their cohorts very deeply, because they rely on graduate students for cheap labor, and the solution makes sense only if the overall number of graduate students in the country shrinks dramatically. Second, cutting cohort sizes means that graduate students will have fewer or no peers in their subfields, which not only damages intellectual community but also makes it impossible for them to take specialized seminars in those subfields.

Another possible solution is introducing serious training for non-academic fields into our Ph.D. programs so that students can enter these fields more readily. The problem here is that these fields are numerous and diverse — they include publishing, academic administration, museum work, library work, high-school teaching, community-college teaching, foundation work, think-tank work, journalism, diplomacy, etc. — and history-department faculty generally have expertise in none of them. Yes, we could subsidize internships and bring in guest personnel to run programs. But this “alt-ac” preparation would also take serious time away from dissertation work, which could hurt our students’ chances of getting the few tenure-track jobs that do come up.

Here is my solution. It’s more radical, and probably a fantasy, since it would require the sort of large-scale structural change that our decentralized university system isn’t really capable of. Nevertheless, here goes: Reduce the Ph.D. to four years — two years of course work plus two years of research and writing, with the goal of two published articles, rather than a book-length manuscript, at the end of it. Eliminate most stand-alone master’s programs, which would now be redundant (although not those programs which principally train K-12 history teachers). Eliminate teaching requirements for Ph.D. students and eliminate all current postdoctoral fellowships. At the same time, create a relatively small number of full-time, paid (and benefit-conferring) instructorships to cover the teaching that graduate students now do. People would hold these positions for a maximum of five years. At least at some institutions, the funds currently used for postdocs could be repurposed to provide research and writing leave for instructors. At the end of five years, instructors could apply for jobs as assistant professors. The majority of doctoral students would leave the field after their four-year Ph.D. program, but at least they would not have invested anywhere near as much time in the process as graduate students do now. Those lucky enough to go on to instructorships would have a much greater chance of eventually landing assistant professorships and could move on to this tenure-track rank after a total of nine years, roughly equivalent to the time it currently takes for most graduate students to reach the same point.

Under this system, only the students who went on to instructorships would be able to take their dissertation projects in directions as challenging and ambitious as all graduate students in the field currently have the opportunity to do. That is a trade-off, and a serious one, that will possibly affect the overall character of scholarship in the field. I still think that, over all, it would be worth it, given how dysfunctional the current system has become.

This new system would probably work for wealthy private universities like Princeton. Would it work, financially, for public universities that currently rely on graduate-student labor? Conceivably it would, because these schools could reduce the number of fellowship-receiving Ph.D. students to the level necessary to balance their books. A more serious problem is that many universities have become addicted to the labor of adjuncts, who often cost them less than graduate students. Ideally, in this new system, instructors receiving benefits and a living wage would take over most of the teaching currently done by adjuncts. But would university administrations be willing to make this change? I have my doubts.

Again, the chances of anything like this new system being implemented are remote. But moments of crisis demand radical thinking, however unlikely, in the hope of generating ideas that may actually bring about helpful change.
