Al Swearengen, the foul-mouthed saloon keeper in HBO’s Deadwood, shares a nugget of wisdom in a rare moment of civility: “Announcing your plans is a good way to hear God laugh.” The line riffs on a Yiddish proverb, “Mann tracht un Gott lacht,” or “Man plans and God laughs.”
The principle seems apt not only for our times, but also for the calendar I’ve developed for my 2025 series. It made sense in December, but so much has changed since then in our country and for me personally that other questions feel more pressing. So I’m breaking from my usual schedule this week, perhaps for the foreseeable future. I want to write from the heart, not from a book map that no longer has blood flowing to it. When it feels right to return to fatherhood and to craft, I’ll do so.
This week I’m turning to two op-eds in the New York Times that caught my eye, not only for their predictions about the future of higher ed, but also for what they gloss over in what led us to the current moment.
The first is the Editorial Board’s “The Authoritarian Endgame on Higher Education.”
The authors note that totalitarian leaders not only quash the free press and the checks and balances within government, but also seek to weaken universities, which have historically acted as independent sanctuaries for truth seekers who feel empowered, within the protective walls of their institutions, to challenge propaganda and insist on evidence-based claims about everything from history to public health.
Context, evidence, and nuance matter in a healthy university. Questions are more interesting than definitive answers. Science earns its authority only when its claims are falsifiable: if a claim is not structured so that it could be proven wrong by better evidence, it cannot be accepted as reliable. This is also true of humanities research, albeit with texts rather than the material world: every argument is made in good faith with the best evidence available, but also with the understanding that reasonable scholars might disagree, and that no text is ever reducible to a single reading.
Certainty and simplicity, staples of political campaigning, are anathema to intellectual inquiry. As I often illustrated in my American literature survey, no candidate for public office who has invoked the metaphor of America as a “city upon a hill” and a light to the world has ever (to my knowledge) acknowledged that this phrase originated as a message for Puritan New England long before there was any conception of America or the United States. The city that John Winthrop imagined was a tightly governed theocracy with a rigid class system, not a sanctuary for the tired and poor yearning to breathe free, not a democracy, not an engine for economic opportunity and mobility. This observation leads to no obvious conclusion, but raises fair and reasonable questions about just what Ronald Reagan meant when he alluded to the “shining City on a Hill,” whether John McCain meant the same thing when he revived the phrase, and whether Kamala Harris invoked Winthrop’s metaphorical city to draw a contrast between her vision and others or whether her writers borrowed the phrase as a cynical ploy for crossover voters.
In a healthy society, questions like these hold public leaders accountable for using language carefully and truthfully. In the classroom, such questions are cornerstones of critical thinking and persuasive writing. But none of that matters without an intact and durable university system.
The Editorial Board acknowledges that universities have strayed from truth seeking and that any meaningful resistance to the current attacks on higher ed must begin with a restored commitment to intellectual integrity:
Too many professors and university administrators acted in recent years as liberal ideologues rather than seekers of empirical truth. Academics have tried to silence debate on legitimate questions, including about Covid lockdowns, gender transition treatments and diversity, equity and inclusion. A Harvard University survey last year found that only 33 percent of graduating seniors felt comfortable expressing their opinions about controversial topics, with moderate and conservative students being the most worried about ostracization.
These examples skim the surface of how universities have sold their intellectual birthrights for bowls of corporate stew and exchanged scholarship for something closer to idolatry. I’ll return to these themes presently.
The second op-ed that caught my eye is Meghan O’Rourke’s “The End of the University As We Know It.”
O’Rourke is an English professor at Yale University, author of five books, and editor of The Yale Review. She paints a grimmer picture from inside an elite university, where even a $41.4 billion endowment cannot prevent the specter of budget cuts. O’Rourke fears losing a program she runs for students seeking careers in editing and foresees a future in which universities comply with Trump’s demands to avoid losing federal funding.
Whereas the Times Editorial Board focuses on damage to science and technology research, O’Rourke makes a case for the humanities as a form of truth seeking. Twentieth-century literature, in particular, reveals that scientific innovation, untethered from any other structure of ethics or meaning, is morally bereft — capable in its worst forms of simply producing unprecedented destruction. The STEM obsession that has plagued K-12 and higher ed since the 1990s, if not longer, requires historical blindness, as if we learned nothing from the Industrial Revolution, two world wars, and the ongoing threat of nuclear annihilation. This is, to me, the fundamental takeaway from Oppenheimer — not the incredible power that the greatest scientific minds can produce in a short time, but the incredible harm that can come from worshiping such power with no thought to its human implications.
As a discipline, the humanities have more ancient roots, but O’Rourke is right to emphasize the impossibility of simply returning to the ill-defined “classical” approach favored by conservatives like Christopher Rufo, as if we could just go back to reading Paradise Lost the way it was taught in the 1950s and ignore the explosion of ethnic American literature in the last two hundred years. When I began my doctoral studies in 2001, the discourse was more conciliatory: an expanded canon raised questions about what to prioritize on a syllabus, but there wasn’t a prevailing feeling that the entire Western tradition could be dismissed as racist or that the only way to teach older texts was by focusing on their moral and cultural failings by contemporary standards. In fact, a great many of the Native American, African-American, and Chicano works that I loved to teach include references to Shakespeare, Milton, and the Bible. The unending conversation in those works is still intact.
O’Rourke offers no blueprint for resistance, really, other than a renewed commitment to touting the public goods that universities offer. This is as close as she comes to a thesis:
If the university has always been politicized one way or another, why should conservatives care about protecting the intellectual freedom currently housed in what are predominantly liberal institutions? The answer is earnest and aspirational: because the serious, reflective work of scholarship benefits us all. Because academic freedom makes it possible to critique institutionality from within at a time when institutions rule our lives. Because it permits intellectuals and scientists to question realities we have become complacent about. Because it creates space for values that live outside the capitalist marketplace. Because it houses art and artists. Yes, the university can be, like any community anywhere, divisive, censorious, sometimes too ideologically homogeneous. But when it works, it trains people to think critically, powerfully and unflinchingly.
I agree with much of this reasoning. Yet much of it sounds as nostalgic as the conservative view of a classical Golden Age. O’Rourke’s picture was still true of the American university during the 1990s. Remnants of it survived during my early years as a professor. I can see how Yale might have been shielded from the intellectual erosion that other less elite schools have been experiencing for years. But the kind of existential panic that Yale is just now experiencing, and O’Rourke’s conviction that the very conditions for free thought are under attack, are recent manifestations of the intellectual erosion that has plagued many other colleges and universities for much longer.
I agree that the damage of the current attacks will be felt for decades — likely throughout my children’s lifetimes, perhaps throughout their children’s. But let’s not deceive ourselves about how we got here or mistake the authoritarian endgame for the original cause of higher ed’s woes. Trump and Musk are like vipers who have slithered into an influenza ward where every patient is suffering gravely, and many are already at death’s door.
Here, in my humble opinion, are some of the changes in American colleges and universities throughout my lifetime that have weakened their intellectual integrity long before Trump, Musk, Rufo, and Vance began trying to finish them off.
The corporatization of higher education is a betrayal of its mission. It would take a better historian than I to pinpoint a starting point for colleges and universities replacing their core missions as nonprofits with corporate imperatives, but this process is at least two decades old. As nonprofits, universities prize intellectual inquiry, cultural richness, and truth seeking above all else. The budget makes these goals possible, not the other way around. As corporations, universities prize hierarchy, branding, and transactional outcomes that increase market share. Every department and program must justify itself by the budget, not the other way around. This is how it came to be that a Nobel Laureate subordinated his academic legacy to a football coach’s personal brand in a television advertisement.
The demise of peer review degrades the entire research enterprise. Scholarship has never been a perfect system for creating knowledge, but blind peer review has long been a linchpin for reliable information. One might say that even if peer review is vulnerable to social bias, it is preferable to any other system available. The idea is that two experts review new research without knowing anything about the author and judge it to be persuasive or not, sometimes offering revision suggestions to bolster incomplete findings. If both experts and the journal’s editor agree, the work is published and becomes part of the body of knowledge in its field. Any new scholarship must orient itself in relation to that body of knowledge and correct it, if necessary. This is the primary form of research in the humanities and many other disciplines. But peer review is largely pro bono work, which increasingly raises questions about how equally that labor is shared among faculty and, in turn, who is willing to take on that uncompensated work. Since the pandemic, participation in peer review has precipitously declined, leading to serious delays in publication (by which time some evidence might be outdated) and lower overall quality in the pool of reviewers. As I wrote last fall, sometimes works are published in peer-reviewed venues without any evidence that they have actually been reviewed. Peer review is fundamental to truth seeking in a university, and scholarship has been compromised in this area for at least five years.
The demise of tenure is the demise of academic freedom and academic quality. Neither of the op-eds above mentions this, but the decline of tenure-track faculty has steadily weakened the integrity of higher education for at least two decades. Tenure is a lifetime contract given to faculty who meet the standards for research, teaching, and service defined by their institutions. Its purpose is to protect academic freedom, to ensure that no scholar is punished or fired for exercising free speech. But one result of corporatized higher ed has been a steady replacement of tenure-track hires with adjuncts, instructors, or lecturers who work for lower pay, often without benefits, job security, or protections for academic freedom. The percentages vary by institution, but more than 70% of faculty at American universities are now adjuncts, instructors, or lecturers; fewer than 30% are tenured or tenure-track faculty. The workloads borne by contingent faculty leave little time for research or intellectual development beyond immediate teaching responsibilities. While many in this cadre are passionate and devoted teachers, the potential for burnout is severe and the impact on the quality of education is inevitable.
Standardization, alignment, and assessment are killing innovation. Another outcome of the corporatized university is the belief that teaching and learning can be subjected to the same process of continuous improvement that businesses use to increase productivity and efficiency. My department was asked to write an “operational plan” as early as 2005, which we did dutifully, but that task was understood to be more for our benefit than for any external audience. In recent years, words like “standardization” and “alignment” have carried increasing weight, requiring enormous labor to overhaul curricula with these goals in mind, and more labor to evaluate student work according to assessment metrics for outside accreditation. It’s often hard to find sustained time to read with that kind of workload, much less conduct original research or explore innovative teaching strategies. A healthy university recognizes that good teaching plants seeds that often can’t be measured for years, that the best teachers approach their craft with humility and mystery, that students achieve mastery at different paces and sometimes in different sequences of courses, and that experiences such as epiphany (which lead to the most enduring discoveries) cannot be measured by an assessment metric. A diseased university discourages teachers from taking risks or thinking outside the rubrics and turns them into drudges like the beleaguered employees dutifully completing their TPS reports in Office Space. Standardization, alignment, and assessment are subtle strangleholds on academic freedom because their primary purpose is to communicate ROI to prospective students and parents, not to enhance intellectual inquiry.
The higher the cost, the higher the risk of free exploration. The causes of the ballooning cost of a college degree are too numerous to list here. But the fact is that a degree now costs roughly as much as a home does. It’s a monumental financial decision that brings all kinds of pressure to know exactly what you’ll study and what you’ll do with that degree afterward. Hence the urgency for institutions to demonstrate ROI and to ensure that every course builds some kind of measurable competency. The arts and humanities have tremendous value for character building, much of which manifests in unpredictable ways that are relative to each individual. It’s not as simple as abstracting “communication skills” from the degree; it’s things like discernment, authority, the ability to synthesize different sources of information, and creativity (which cannot be scripted) that set the humanities graduate apart. But because it’s difficult for humanities departments to say that their majors build particular skills which plug reliably into particular jobs, arts and humanities majors seem risky; nearly any STEM field seems like a more responsible choice. The unfortunate result of this risk aversion is that it undermines the purpose of a core curriculum, that set of courses that every student must complete to graduate. Core requirements give students opportunities to explore fields they would never otherwise have tried. When the cost of college was lower, this often meant you could discover a passion you didn’t know you had and change your major without feeling like you were throwing tens of thousands of dollars away or rolling a giant set of dice. When high cost breeds fear about making the right choice, few students end up exploring a college education freely. This not only limits students’ self-discovery and personal growth, it means that the university system is nearly as far from truth seeking as it could be.
Instead, it’s a process of hedging bets, hoping for an immediate payoff, then hoping to avoid layoffs long enough to pay down debt.
There are other factors at work, but these are some of the reasons why saving higher ed is more complicated than these two opinion pieces make it seem. The fractures run much deeper and have been deepening throughout my lifetime.
These are questions I’ve asked in different forms before, but I wonder if any of you are thinking differently about them now, given what you’re reading and perhaps witnessing, if you still work in higher ed:
Is there still time to fortify higher education in America and restore it as a public good? If so, how?
If decades-long damage is inevitable, what are the implications for our children and for their children? How do we best prepare them for those realities?
Today, at the partner site, I’m sharing a post from the archives about how universities came to be viewed by Americans as both a public good and a public threat.
Great post! Another consideration (IMO): the Reagan administration’s federal tax and budget cuts sharply reduced the federal money flowing to the states, which forced the states, unable to print money, to cut their own costs. Those state cuts hit public universities hard (and are ongoing). It took the universities a long time to figure out how to deal with their funding crisis, but many of the changes you identify are directly traceable to the solutions they settled on: corporatization; turning campuses into luxury communities to attract wealthier students; heavier reliance on big-ticket sports; increased dependence on federal research funding; unpaid peer review work; the transition from largely tenured (and expensive) faculty to lower-paid, part-time, untenured faculty; and standardization aimed at making assessment less expensive. These changes go further back than just a couple of decades, but they have accelerated.
I started grad school in 1997. Even then, there were students with a vulgar-Foucauldian point of view that there is no such thing as truth, only narratives and power. They viewed themselves as agents of change, and they believed that any invocation of truth served the status quo (capitalism, colonialism, etc.). No one considered what would happen if the vulgar-Foucauldian view became generalized and people outside the academy, including conservatives, adopted it.