This article is the second in a series on how President Donald Trump changed history—reviving historical debates that have simmered on low heat for years, and altering how historians think about them. See the first in the series here.
Americans have always thought their country was exceptional. They thought it even as early as 1630, when John Winthrop delivered a now-famous sermon in which he called the Puritan community a “city on a hill”—long before there even was an American country.
In more recent years, the idea of American exceptionalism has become tainted by politics—a rhetorical cudgel that politicians, particularly conservatives, wield to bludgeon their opponents. During President Barack Obama’s tenure, Republican leaders expressed concern that, in Newt Gingrich’s words, there was “a determined group of radicals in the United States who outright oppose American Exceptionalism.” Mitt Romney claimed that Obama didn’t “have the same feeling about American exceptionalism that we do.” Former New York Mayor Rudolph Giuliani went a step further. “I do not believe that the president loves America,” he declared. Unlike his predecessors, Obama didn’t seem to appreciate “what an exceptional country we are.” Obama ultimately felt compelled to correct the record. On July 4, 2012, he paid tribute to a group of newly naturalized citizens, celebrating their diversity and service to country as “one of the reasons that America is exceptional.”
It’s notable that the Republican Party’s most recent standard-bearer, President Donald Trump, has disavowed the very idea of “American exceptionalism.” “I don’t think it’s a very nice term,” he said. “I think you’re insulting the world.” But that doesn’t mean that Trump has chucked this dearly held principle. When most conservative politicians invoke the term “exceptionalism” they use it as shorthand for raw national chauvinism—the assertion that the United States is not just different, but better. Trump has replaced it, at least temporarily, with an angrier tag line that conveys the same sense of national power and entitlement—America First, itself a term ripped from history and freighted with dark meaning. When America is first, it owes little to everyone else. It’s a more Trumpian way of saying what other politicians often mean.
When they use the term “exceptional” to connote pure superiority, though, politicians generally betray a facile grasp of history. In its original formulation, American exceptionalism was a much more complicated theory. It conveyed the idea that the United States was immune from social, political and economic forces that governed other countries—specifically, that it was invulnerable to communism and fascism, and to violent political convulsions of the sort that jolted Europe throughout the long 19th and 20th centuries. It also implied that Americans bore a providential obligation to be exemplars of virtue in a sinful world.
Exceptionalism was for many decades a hotly contested topic among historians and social scientists. Could arbitrary borders really render an entire country exempt from broader social, economic and political forces, particularly in an age when these borders became more porous to the movement of capital and labor? Or did patterns of political development in fact create unique forms of national “character”?
In more recent years, the debate cooled. While some political scientists continued to explore potential variants of American exceptionalism, most historians concluded that the idea was meaningless and the very conversation itself stale.
Then came Trump.
His election and the conditions that accompanied it—a growing rejection of science and evidentiary fact, extreme political tribalism, the rise of conservative nationalist movements around the world, a popular reaction to immigration and free trade—may offer final and conclusive proof that there is nothing at all exceptional about the United States. We are fully susceptible to the same forces, good and ill, that drive politics around the globe.
But before we sound a death knell for the idea, it would help to remember what it actually means.
Ironically, the phrase that so many conservatives traditionally claimed as their own first entered the popular lexicon in 1929, when Joseph Stalin censured what he called the “heresy of American exceptionalism.” At issue was the insistence of American communists, under the leadership of Jay Lovestone, that their country’s economy was developing on a timeline divergent from that of Europe, thus necessitating an intermediary period before outright revolution.
This seemingly benign idea set American communists on a collision course with Moscow. Marxist orthodoxy held that the laws of political economy were universal and immutable. History—in this case, the various stages of capitalist development—operated the same way everywhere. No one country was immune to its universal principles. The Soviet leadership purged Lovestone and his supporters and replaced them with more compliant loyalists.
If the phrase was new, the underlying sentiment—that America was somehow different, or special—was not. It dates to the earliest days of colonial settlement, when the Puritan settler John Winthrop conceived of America as a “city on a hill,” a distinct place with a heaven-sent obligation to build a new and pure world. In the aftermath of the War for Independence, many citizens agreed with William Findley, a farmer from western Pennsylvania who would later serve in Congress, that Americans had “formed a character peculiar to themselves, and in some respects distinct from that of other nations.” During his travels across the young country in 1831 and 1832, the French statesman Alexis de Tocqueville concluded that “the position of America is therefore quite exceptional,” for indeed, many of the people whom he consulted believed, in the words of one of his Boston informants, that “there are no precedents for our history.”
There were two intertwining strains of this popular but still nameless idea. The first was rooted in Puritan theology and held that America was divinely selected for greatness and mission. The second presumed that the United States was unique in the character of its people, economy and politics.
With the advent of modern universities in the late 19th century, scholars attempted to explain this historical uniqueness. Historian Frederick Jackson Turner offered the most lasting theory in 1893, with his essay, “The Significance of the Frontier in American History.” By Turner’s reckoning, the American frontier had been a great leveler where pioneers shed all semblance of status and heritage and worked together to tame a fierce wilderness. It was a cauldron of democracy—the “gate of escape from the bondage of the past.”
Turner’s thesis created more despair than pride. While the country was in many ways brimming with confidence—“the greatest destiny the world ever knew was ours,” said future Secretary of State John Hay in a typical display of national chauvinism—alongside this spirit of optimism ran a parallel current of dread. In 1890 the U.S. census declared the western frontier officially “closed.” If the frontier had been the greatest source of America’s unique political economy, now, with no more territory to conquer and civilize, the country’s raw ambitions and talent would surely dissipate.
The essay struck a resonant chord, for already many people saw signs of weakness in the culture. As more men worked as clerks and professionals, they let atrophy the muscle, brawn and sheer courage that it took to break the land. Even manual laborers were now more likely to be employees of other men, rather than self-sufficient yeoman farmers or shop owners, whom earlier generations of Americans regarded as the foundation of the republic. Worse still, millions of new immigrants from southern and eastern Europe threatened to dilute the heritage that many old-stock Americans viewed as central to the nation’s past success. The United States was losing its edge, or so people worried. It would not be the last time that concerns about “exceptionalism” ran parallel to doubts about the country’s future prospects.
The frontier thesis was just one of many attempts to identify and explain points of distinction. Many historians and political scientists sought to answer a question that Werner Sombart, the German sociologist, posed in 1906: “Why is there no Socialism in America?” It seemed to take root everywhere else, but not in the United States. Was it America’s ethnic and linguistic diversity? The inability to forge a working-class identity that transcended race? A surfeit of open land to absorb the working class? These possibilities animated social scientists then, and to a large degree, even now.
Though exceptionalism claimed deep roots in American tradition, the concept truly came into its own in the 1950s, as a generation of historians, writing in the wake of World War II, sought to make sense of why their country, alone, had escaped the violent disruptions that beset Europe over the previous 150 years: revolution, regicide, class uprising, total war and genocide. None of it had happened in the United States, these historians noted—conveniently glossing over the violently repressive regimes of chattel slavery, Redemption, the wars on Indian nations and Jim Crow, as most historians writing in those years blithely did.
Some, like David Potter, believed that material abundance freed America from the economics of scarcity that drove the trajectory of modern political development in Europe. It created an environment in which America’s “distinctive national character” could take root and grow. Others, like Louis Hartz, proposed that the absence of a feudal heritage set Americans on a uniquely tranquil path. Drawing on Tocqueville—who observed more than a century earlier that the “great advantage of Americans is that they have arrived at a state of democracy without having to endure a democratic revolution; and that they are born equal, instead of becoming so”—Hartz posited a longstanding “liberal tradition” common to the vast majority of Americans. His theory fit neatly inside the mid-century “consensus” school of history, which held that despite tactical differences over economic and political questions, most political actors in American history—Federalists and Democratic-Republicans, Whigs and Democrats, Democrats and Republicans—shared a broad common belief system that rejected doctrinal extremes of the far right and far left.
The common thread binding these diverse interpretations was a belief that the rest of the world (and certainly Europe)—but not the United States—operated according to a singular collection of economic and historical rules.
The idea of American exceptionalism fell into intellectual disrepute by the early 1970s. After a decade of political turbulence over civil rights and Vietnam—police violence against peaceful black protesters, urban riots, political assassinations, pitched anti-war protests, acts of political terrorism by left-wing extremists—it seemed hardly certain that the country was any less fractured and unstable than its European cousins. Exceptionalism no longer seemed like a sure thing.
It wasn’t just external events that inspired a reconsideration. As successive generations of scholars grew interested in such phenomena as transnational immigration and borderlands conflict—not just between Mexico and the U.S., but between nation-states far and wide—it became ever clearer that American history is inextricable from global events. The field of comparative history also inspired researchers to recognize that America may be different from other countries, but it is not governed by a unique set of rules regarding state-building, economics, religion, ideology and even socialism. Nor are all other countries driven by the exact same global forces. All countries experience some degree of differentiated political and economic development, and as Princeton University historian Daniel Rodgers observed, to ask, “Is America different?” is no more instructive a question than “‘Is Argentina different?’ Or Afghanistan.”
In short, many historians today argue that it was always folly to assume that invisible forces drove all of global history, but greater folly, still, to assert that those forces were somehow inoperable within America’s ever-changing borders and among its perpetually evolving citizenry.
If you still need convincing, look no further than recent events. It’s become axiomatic that Trump’s victory was of a piece with populist insurgencies as far and wide as France and the Netherlands, Britain and Greece. Driven by backlash against open borders, free trade and common markets, they share an angry ethno-nationalism, a rejection of elite institutions and actors, and a wispy nostalgia for an imaginary past. These movements reject sexual and cultural pluralism and, in some cases, claim a common patron in Moscow.
Nor is it clear that America’s democratic norms and institutions are fundamentally any more rock solid than those of, say, Poland, Hungary or Venezuela. As bad as? No. But any more unassailable? To observe Republican legislators in North Carolina attempt to abrogate the results of a gubernatorial election—to say nothing of House Speaker Paul Ryan and Senate Majority Leader Mitch McConnell, who have all but abandoned the legislative committee process in their feverish effort to cram the courts with conservative judges and pass deeply unpopular class legislation by cover of night—one can’t help but worry that the United States has slipped into what sociologist Larry Diamond calls a “democratic recession.” It’s a global, not a uniquely American, phenomenon.
If we’ve learned anything in recent months, it’s that the United States isn’t immune to the forces of history. It is, on the contrary, swept up in them. That fact seems the very repudiation of American exceptionalism, if we’re to use the term correctly.
Or is it?
In a slim but prescient volume published over 10 years ago, historian Eric Rauchway examined an earlier era in which “globalization”—the worldwide movement of capital, labor, information and ideas—generated political and cultural populist backlash in the United States. From the mid-19th century through World War I, America absorbed tens of millions of immigrants, trillions of dollars in foreign investment capital (in current-day money), and launched a massive colonization program on par with that of European nations like Russia, England and France, though in the case of the United States, colonization took the form of western expansion on American soil. We were, in effect, deeply caught up in global currents.
By his own admission, Rauchway neither “cheers nor jeers” for the concept of American Exceptionalism (“nor am I even especially interested in it,” he continued). Instead, he wanted to explain “discernible degrees of difference in the impact of world systems and the extent to which they appear to have mattered in American national development.” In this endeavor, he did detect much about the United States that was different.
Whereas other countries raised large armies and state bureaucracies to subdue and govern colonial land and people, the United States didn’t have to, and so it didn’t. These other world powers drained their treasuries to fund increasingly influential state institutions that worked to draw indigenous populations into a permanent but subordinate relationship; the United States, by contrast, maintained a small central government and standing army. This difference carried political consequences.
Whereas other central states funded the lion’s share of infrastructure and capitalist development, America enjoyed such abundant access to foreign capital that its railroads, telegraph systems, extractive industries and agricultural industry grew up around private investment. All of these points of differentiation contributed to a very different pattern of political development.
Working men and women who faced labor competition from new immigrants migrated en masse to destinations west—before 1900, usually to new farms; after, to cities where they found factory or service employment. Roughly 20 percent of native-born Americans were pulling up stakes and moving each year, according to the Census, and not always willingly. In an age of growing wealth and economic inequality, many such native-born Americans grew to resent immigrants, whom they blamed for their condition. But they also nursed intense hatred toward banks, railroads, grain operators, mine owners and financial elites, who (they believed) kept them in a state of economic privation and dependency.
The result was a particular brand of American populism (which I’ll explore in greater depth in the third article in this series). It was often viciously nativist. It was anti-statist; unlike socialists in Europe, political radicals in the United States tended to embrace punitive regulatory policies that would rein in large corporations, rather than large-scale social welfare policies that extended government health care and pensions to working people.
If all of this sounds eerily familiar, it should. Substitute Mexican immigrants for Europeans, Facebook and Google for Standard Oil and Union Pacific, J.P. Morgan Chase for … well, J.P. Morgan & Co. and Chase National Bank, and the story is similar.
As was the case one hundred years ago, America today is very much part—if not at the center—of history. It isn’t immune to the same currents of immigration, free trade, population aging and technological change that are upending political and economic systems around the globe. To believe that we’re “exceptional,” in the way that historians understand the term, is to reject reality for national theology.
But if we’re not exceptional, we might still be different, and the key to understanding how we make it out on the other side of any political storm is probing the strengths and vulnerabilities that flow from this difference. Tocqueville’s Boston informant posited that “there are no precedents for our history.” He was wrong. The trick is knowing which ones to look for.