This week marks the end of the first quarter century of the 2000s. It’s a tidy chronological milestone but also an invitation to step back and ask a deceptively simple question: What, from the past 25 years, will actually matter to history?
Not to cable news or Twitter/X feeds or the daily churn of punditry—but to history in the deeper sense. What will a first-year survey course cover 50, 100, or even 500 years from now?
The answer is almost certainly far less than we think.
I still remember my first university course on Western civilization. We raced through Martin Luther, the Renaissance, the Glorious Revolution, the Enlightenment, the Industrial Revolution, and the two world wars. That was basically it. Entire lifetimes and continents collapsed into a handful of lectures. What felt all-consuming in one era barely registered in another.
So what of the first 25 years of this century will earn that kind of condensation? Which developments will survive the brutal triage of historical memory?
The century began with Y2K—a nothingburger in practical terms, but perhaps more significant symbolically. It revealed just how deeply the internet and computer code had become embedded in modern life. The panic itself was the point. It was an early signal that the digital world was no longer ancillary to human civilization but foundational to it. In retrospect, Y2K looks less like a false alarm and more like a foreshadowing—an early tremor before the rise of social media, smartphones, and now generative artificial intelligence and the plausible prospect of artificial general intelligence. That arc, unlike Y2K itself, seems likely to command long-run attention.
Then came September 11, 2001. I was in my first days of university when the twin towers collapsed. Like most people of my generation, I assumed I was witnessing a hinge moment in history—something akin to the assassination of Archduke Franz Ferdinand or Pearl Harbor. And in the short run, it certainly was. It reshaped foreign policy, domestic security, and an entire generation’s psychology.
Yet a quarter century later, its historical significance feels less clear. The ensuing conflict against radical Islam and non-state terrorism proved more protracted, more ambiguous, and less existential than originally envisioned. Al-Qaeda was degraded, ISIS rose and fell, and the West muddled through. Civilization didn’t fracture.
But perhaps the deeper historical importance of 9/11 lies not in the attacks themselves, but in the overreach that followed. The wars in Afghanistan and Iraq—and their costs, misjudgments, and frustrations—appear to have catalyzed a growing skepticism within the United States about unipolarity, global leadership, and the burdens of hegemony.
The throughline from the post-9/11 consensus to the eventual rise of Donald Trump isn’t especially hard to trace. If future historians linger anywhere, it may be on how America’s response to 9/11 planted the seeds of a domestic revolt against the post–Cold War order.
Trump himself—along with the broader rise of populism and reactionary politics—does feel historically significant. Assuming he leaves office at the end of his term without a fight, his importance is disjunctive rather than constructive. He didn’t inaugurate a new paradigmatic settlement so much as signal the end of an old one. The era of post–Cold War globalization and neoliberal consensus effectively died on his watch. What replaces it remains unresolved. That uncertainty may be the defining political feature of the next quarter century.