Thinking about the past thirty years can help readers get a sense of the scale of changes that are possible during the thirty years to come. In 1990, the Soviet Union and the Cold War were in place, computers were a niche business, and the world's leading advocates of deep neoliberalization, Ronald Reagan and Margaret Thatcher, were out of power or on their way out. Not many years later, the Cold War had been replaced by a “Global War on Terror,” computers had morphed into a global techno-economic system with names like platform capitalism or surveillance capitalism, and neoliberalization not only outlived these dominant advocates but spread around the world. Simultaneously, work, including the work of university graduates, became increasingly precarious. Racial equality did not come to pass, nationally or globally, and yet even its prospect produced a rising backlash. The post–Cold War “peace dividend” was replaced by continuous local and regional wars. In the past thirty years, the United States in particular became post–middle class, post–civil rights, and postdemocratic. Future universities will need to confront all three legacies if they are to transform themselves.
This overview of the past thirty years is meant to be bleak and United States–centered—its role in the conference was to goad reflections on sharp changes in direction. The US research university and liberal arts college served as global gold standards throughout the post–World War II period, which allowed their weaknesses as well as their strengths to shape the development of many national systems in both the Global North and the Global South. This prestige has begun to fade, but the system's ongoing stature in global rankings and other public arenas means that its features continue to have global implications. I'm going to list the core negative elements of the incumbent university structure of the United States.
A preliminary note: many commentators would likely explore one or the other of two important achievements. The first is the technological revolution associated with digital capitalism, which was largely funded by the government and invented, at least in its early stages, in universities. As wondrous as the science and engineering was and is, the gross underinvestment in matching social and cultural research—and in distributive social systems—has undermined its benefits and created other, well-known problems that we are not yet handling well.
The second achievement is the racial diversification of higher education. In contrast to the 1960–90 attainment boom, the 1990–2020 period's growth, with its more than doubling of African American and Latinx degree attainment, ended the white near-monopoly over bachelor's degree share, reducing it from 85 to 63 percent. The shift is important, but in contrast to the previous thirty-year period, this attainment growth for students of color was done on the cheap. While the first period saw an increase in real-dollar per-student state funding from $5,000 to $8,000, by 2020 this metric was actually 6 percent below the $8,000 level at which the second period began in 1990. Educational quality, relative to social needs, may have declined during the second period, although neither universities nor governments are tracking this.1 Both the digital gains and the attainment gains for underserved students are real, but their impacts have been undermined or corrupted by the trends I'm going to identify. These trends have locked in a status quo that we're here calling the First Horizon—locked unless we realize the constraints were deliberately chosen and can be unchosen, too.
In 1990, we didn't yet call the Reagan-Thatcher realignment by our current term neoliberalism. Nor did it seem like this conservatism would succeed in reframing antiwar, civil rights, and feminist movements as threats to mainstream America. Reagan's Yanqui imperial revival in Latin America was destructive in a familiar Cold War way, but the Sandinista government stayed in place in Nicaragua, and the Iran-Contra inquiry that began in 1986 seemed to have encased US militarism in a legal framework that, like the Church Committee investigation into US intelligence in 1975, promised an expanded rule of law. The first Gulf War, launched by the first Bush president, generated domestic opposition that suggested to many a continuing popular interest in containing the Pentagon. The white backlash against busing to integrate US schools, the white middle-class tax revolts that started with Proposition 13 in California in 1978, and the deindustrialization and financialization that spread in the 1980s did not yet seem like permanent conditions. Some public universities, including the University of California, saw strong budget increases in the 1980s. Big Tech was about to become transformative but still appeared benign: in 1990, Intel was just over twenty years old, Microsoft Windows was five, the Human Genome Project had just been founded, and Tim Berners-Lee had just invented the hypertext markup language, or HTML, and given his first demo of the World Wide Web.
Greater Santa Barbara, where the authors in this journal section met in early March 2020, has a geological twin in shallow-water petroleum deposits—the city of Baku, on the Caspian Sea. In January of 1990, the Soviet government sent twenty-six thousand troops to stop Baku's independence movement. The result, often called the January Massacre, and the continuing Azerbaijani resistance were unwatched heralds of the USSR's final year of existence. In short, 1990 did not yet signal a locked-in historical shift toward neoliberalism, and many signs pointed to multiple pathways toward a more open future.
But the next thirty years weakened all pathways other than the neoliberal path. This was not foreordained, but it did happen. I will summarize this as a series of continuing challenges.
The first of these major challenges has been the knowledge economy. Thirty years have brought two rounds of the tech-based “new economy,” one in the 1990s and another in the 2010s. Round 2 tech is more ominous than its predecessor, defined by justified fears of monopolization, mass disinformation and propaganda capabilities, and a pervasive gig economy. By 2020, its ethos could be fairly represented by titles like Cyber-Proletariat, Lurking, The Age of Surveillance Capitalism, and Dying for an iPhone.2 Contemporary capitalism's environmental damage is now widely acknowledged and causing ever-widening alarm. Even before counting the effects of the COVID-19 pandemic or its 2021 Delta variant, growth rates in high-income countries for both gross domestic product and productivity had declined (a 1990s productivity spurt was short-lived).3 Whatever the rate at which productivity rose, wages rose even more slowly, and the wage share of corporate income declined.4 The majority of the US population stopped getting real-dollar pay raises in the 1990s.5 Inequality exploded to the exclusive benefit of the top 20 percent of the population, with most income and wealth gains going to the top 1 percent and fractions of 1 percent.6 Economic experience has produced a cultural shift: much of the US citizenry no longer assumes that the country will offer good work to most people—stable work, at a living wage, with the chance of making a meaningful and interesting contribution to firm or society. COVID-19 has brought a new awareness of the category of essential workers and the inadequate pay of most of them, and yet it has increased the inequalities that point toward something we haven't seen before: the refeudalization of some capitalist democracies.
As we know, narratives do not simply reflect but produce real-world effects. This is certainly true with economic narratives. Recall that industrial capitalism had roles for both blue-collar workers, who often had high school degrees or less, and for white-collar workers, who often had two years and increasingly four years of university. Whatever hard-fought but limited parity blue-collar workers had achieved with white-collar workers as cohabitants of the US “middle class” was economically eroded and discredited in the 1990s by leading policy thinkers. One of these was the policy entrepreneur Robert B. Reich, who would become Bill Clinton's secretary of labor in part because of his book The Work of Nations (1991). In that book, Reich, whose post-bachelor's credential was a degree from Yale Law, divided economic functions into three types: “routine production” work, which he declared obsolete in America by defining it, wrongly, as unskilled; next, “in-person servers” like nurses and college professors, who needed to be on site but who were not adding much new economic value and would have mediocre wages and security; and finally, the “symbolic analysts,” the people who added nearly all value in a knowledge economy by working with numbers, words, and ideas. Although Reich personally advocated cross-class social solidarity, he helped instill the idea, at the top of the federal government, that the knowledge workers who formed the professional and managerial classes produced the only value that couldn't be produced better in a lower-wage country. This assumption appeared in the work of many experts and pundits of the 1980s and 1990s—from Michael Porter at Harvard Business School in The Competitive Advantage of Nations (1990), to New York Times columnist Thomas Friedman in bestsellers like The Lexus and the Olive Tree (1999), and on to Richard Florida in The Rise of the Creative Class (2002). The theorists of the knowledge class in this knowledge economy agreed that blue-collar workers did not really use knowledge and would therefore necessarily lose their jobs to low- or medium-skill workforces in low-wage countries—or, a bit later, to automation. In short, conservative and liberal policymakers did not devote themselves to making sure everyone in the United States benefited from globalization. To the contrary, they defined globalization as naturally and logically leaving nonuniversity workers behind. Today's unhappy nonuniversity voters are right to perceive that for thirty years both major parties have treated them as economically disposable and that academics and journalists have supplied the rationales.
By 2000, it was becoming increasingly clear that white-collar college grads would start receiving the same treatment. Cutting against the Reich-Friedman paradigm, the authors of The Global Auction (2010) showed that the main educational strategy of Western countries was ineffective.7 Labeled the “neoliberal opportunity bargain,” it meant that “the state's role was limited to creating opportunities for people through education to become marketable in the global competition.” It had only ever worked for a minority of graduates and was becoming less useful even for them.8
As Americans continued dutifully to increase their educational levels and their productivity, the rest of the world was doing exactly the same thing. While people paid to raise their own productivity through university study—human capital's bargain in which learning equals earning—they increasingly did not retain the value of that increased productivity. This had been true for industrial workers after the mid-1970s. By the financial crisis of 2008, it was visibly true for college graduates.
Note, first, there is a new global norm in which high skills continue to produce strong increases in productivity but do not produce matching increases in the overall wages of high-skill work. Second, there is the reversion of the United States and other high-income countries to the global mean. The younger populations of wealthy countries are losing their parents' exceptional status. Nations like the United States, France, and the United Kingdom are now normal countries with highly prosperous rentier classes and an abundance of low-wage jobs, many of which require high skill. In the period from 1990 to 2020, knowledge economies became gig economies. We may approve of this reversion on ethical grounds, but its price is being paid by younger and more diverse student generations.
Unfortunately, the university has not fought the shift toward the intellectual sweatshop but embodied it. Over the past thirty years, the US university actively moved from a two-to-one ratio of tenure-track to non-tenure-track professorial employment to the reverse ratio, in which the “new faculty majority” is contingent labor.9 Most college teaching has become a bit of a crap job with a permanently uncertain future, and years ago it began to generate the genres of “quit lit” and campus novels that don't star tenured deadwood but precarious adjunct faculty. It is not surprising, then, that in 2019, this combination of high skill and low wages made industrial action a fact of life on campuses across the country. Two of our conference participants were involved in a grading strike at the University of California, Santa Cruz, for a wage increase that would allow them to escape the rent burden caused by their submedian wages as teaching assistants. The pandemic has left all of these issues on the table, unresolved, even completely unaddressed.
If good work has been one major failure of Western societies after 1990, a second failure has been racial equality. This has been a dominant issue in the United States for decades—really since 1619—but only since 1990 have the implications of the postwar civil rights movements become entirely clear to white Americans. Immigration was increased by the end of Asian exclusion from national quotas in 1965, by the end of the Vietnam War ten years later, by the social devastation of United States–initiated wars and repression in Central America during the following ten years, and by the economic hardships in Mexico wrought by Bill Clinton's North American Free Trade Agreement ten years after that. Between 1970 and 1990, California's foreign-born population grew by over 150 percent to comprise 20 percent of state residents, and it rose to 25 percent by 2000. The highest-percentage growth came from Asia, but the Latinx share was much larger: by 2000, Latinx peoples were 43 percent of the California school-aged population. During these thirty years, California became a minority-majority state, closely followed by several other large states, and it is perhaps only one generation of migration ahead of European countries like the United Kingdom, France, Italy, Spain, and Germany. These first two issues—economics and race—continuously interact. The United States faced a policy choice between egalitarian inclusion, based on broad social infrastructural spending, and stratified resegregation. It chose the latter. Racial equity and integration require large social investments and systematic psychological transformation of the majority culture. Neither of these projects was seriously attempted, and de facto segregation has been our default.
Where did universities and their graduates stand on the question of white racial backlash? They were officially opposed, but practically aligned with its major effects. To take just one example, universities did not defend affirmative action policies as remedies for past discrimination, much less as reparations. Throughout our period they accepted affirmative action as seeking diversity, which, though important, has never made the arguments for social justice that might explain to whites why ethics compels them to set their individual interests aside. White university graduates are at least as likely as non-college-educated whites to oppose race-conscious remedies. Most US universities teach their students that economic and racial inequality are the natural result of educational differences—of the difference between backward social groups who didn't attend university and forward-thinking ones who did. Until immigrant and domestic minorities become knowledge workers, in this view, inequality will both grow and remain deeply racialized.
Without downplaying the racism of white non-college-educated Americans, my point here is that something similar goes for a good proportion of college graduates. They don't so much use the language of direct denunciation as the language of structural preconditions and necessities. Narratives that justify unequal economic deserts as the legitimate results of privileged knowledge training are compatible with, and an expression of, racial capitalism. As the numbers of college graduates increased around the world after 1990, and less selective universities admitted larger proportions of people of color, the knowledge rewards have scrambled up the racialized selectivity ladder.
The third great challenge for Western societies is to convert national security into international governance and to end the waging of permanent local or regional wars. In the United States, we tend to trace militarism to the white working class—especially white southerners—and to gun culture. These are real sources, but the continuous use of force also descends from the country's settler-colonial history and moved to the heart of US foreign policy as part of the “closing of the frontier” in the late nineteenth century.10 Permanent war policy was authored by intellectuals from the most prestigious US universities, which housed the country's most eminent authorities. The use of force to ensure global preeminence has been largely unchanged since the diplomat George Kennan stated the underlying principle in a classified 1948 policy planning memo. “We have about 50 percent of the world's wealth but only 6.3 percent of its population,” Kennan noted, as he defined the challenge of US foreign policy for the political structure as “maintain[ing] this position of disparity without positive detriment to our national security.”11 No subsequent president, four of the last six from Harvard or Yale, has sought peace at the price of greater economic parity with the global majority. With Trump, the country reached out to the University of Pennsylvania, and with Biden, into the academia incognita of the University of Delaware, but no break with this orthodoxy has yet been detected. To avoid belaboring this US quest for permanent superiority, I'll just point to Hillary Clinton, a liberal Democrat and secretary of state who supported regime change in Libya in 2011 and in Honduras in 2009, leading to two refugee crises, one across the Mediterranean and one at the US-Mexico border, of the type that typified the period of blowbacks from 1990 to 2020.
I will conclude by summarizing three key features of our current condition, our First Horizon. Economically, the knowledge economy has become an engine of inequality. One way I've put this is that the United States became “post–middle class,” right around the time that more people of color were entering its precincts of relative economic stability. Elites no longer assume that wealth creation depends on having a very large number of university graduates with excellent skills, or a large pool of middle-income people they need to treat well. In this context, “limited learning” in universities is a feature, not a bug, as the graduates of poorer colleges with fewer instructional resources have their entitlements to good jobs diluted in advance. We need to consider the possibility that an underfunded higher education system, enforcing a US version of the “low achievement trap,” appeals to many politicians because it supports the increase in economic inequality that defines the period from 1990 to 2020.12
Second, the years between 1990 and 2020 have seen the confirmation of a post–civil rights era, as Michael Omi and Howard Winant termed it in the late 1980s. Racial equality and real parity of educational and economic outcomes may be discussed but are never seriously pursued. Equality has been eclipsed by a series of controversies around affirmative action, immigration, critical race theory, and so on that divert the society from this primary goal by distorting and stigmatizing it. In spite of their formal commitment to diversity, equity, and inclusion, universities tend to sort white and now Asian American students into selective colleges and African American, Native, and Latinx students into open-access colleges, where they are less likely to graduate.13 Universities have not decolonized either their social relations or their knowledge production; greater awareness of these issues in the period from 1990 to 2020, through the tireless efforts of many activists and many academics, has left racial equality the proverbial dream deferred.
Third, the United States is postdemocratic both domestically and internationally. Colin Crouch developed the term around 2000 to refer to the situation in which “the forms of democracy remain fully in place,” and yet “politics and government are increasingly slipping back into the control of privileged elites in the manner characteristic of pre-democratic times.”14 Postdemocracy defines both foreign policy and our internal politics. We may think of this as microtargeted political messaging on platforms like Facebook, but it is broader and older than that: current empirical political science has confirmed Crouch's view that political and economic elites can now bypass majority opinion on tax policy, education spending, military invasions, job outsourcing, and the like.
So then, I am suggesting three elements of the First Horizon university to bring forward for intervention:
Universities can no longer promise economic security to the great majority of those who study there.
Higher education reproduces more than it reduces racialized economic inequalities.
Universities' governance structures are antidemocratic and help legitimate the managerial and antidemocratic aspects of state and national politics, including economic policy.
To pull out one of these for discussion, we knowledge workers of the world must democratize. It's a good note on which to say goodbye to 1990–2020: 2050 has no chance of escaping dystopia unless we can build an egalitarian US higher-education system by inventing nonhierarchical modes of collaboration that universities have never had.
For details see Newfield, “Budget Justice.”
Economic Policy Institute, “Productivity–Pay Gap”; for workers' share of corporate income, see Economic Policy Institute, “Nominal Wage Tracker,” chart 3.
On the former point, see Brown, Lauder, and Cheung, Death of Human Capital?; and Mehta and Newfield, “A Socialist Alternative?”