Viewpoint

Joanna Baron: Blame Big Tech all you like, but polarization is in our nature

A year of lockdowns has been a cognitive disaster, pushing our discourse ever further into irrationality
Twitter CEO Jack Dorsey speaks during a hearing on Capitol Hill. Greg Nash/AP photo

Can we blame Big Tech social media platforms for increasing polarization and the seeming dearth of civil discourse?

Or is the problem a deeper quandary concerning postmodern identity politics and base human nature, one which has been highlighted by a year and counting of lockdowns?

Last week, Facebook VP Nick Clegg wrote a long Medium article defending the company against charges that it encourages polarization. The theory Clegg, a former UK politician and highly competent hired gun, was pushing back against goes something like this:

Ruled by robots programmed to keep our attention as long as possible, social media algorithms promote stuff we are likely to double-tap on or share — and bury everything else. The effect is that people exclusively read and shape their worldviews via articles that confirm their priors. They meet and converse with others who are similar. Eventually, some of them end up at the U.S. Capitol building with guns.

This narrative was developed at length in Shoshana Zuboff’s The Age of Surveillance Capitalism, as well as the popular Netflix documentary The Social Dilemma.

Clegg countered the “filter bubbles” narrative that has become dominant concerning the effects of social media on polarization. He cited independent academic studies that undercut the idea that the network encourages us to retreat into cocoons of familiar information.

He also laid out Facebook’s new plans to allow users control over their algorithm:

“People should be able to better understand how the ranking algorithms work and why they make particular decisions, and they should have more control over the content that is shown to them,” Clegg wrote. “You should be able to talk back to the algorithm and consciously adjust or ignore the predictions it makes — to alter your personal algorithm in the cold light of day, through breathing spaces built into the design of the platform.”

Indeed, independent studies confirm that the most polarized demographic groups are the least “extremely online.” They are usually boomers and above, people who report getting their news mostly from cable television.

We are motivated not by truth but by emotive stories

Clegg contends that human nature itself is the problem. “Consider, for example, the presence of bad and polarizing content on private messaging apps — iMessage, Signal, Telegram, WhatsApp — used by billions of people around the world,” he writes.

Clegg’s apologia for Facebook points towards a much bigger problem, one occasioned by moving all societal discourse onto digital platforms and for which the train has already left the station: we are motivated not by truth, but by emotive stories.

The problem is not with the Facebook or Twitter algorithm, but with human nature, and the quirks of our primal wetware. To wit: it’s already possible on both platforms, with a few clicks, to reset your feed to rank posts chronologically rather than algorithmically, but who wants to do that when being fed piping-hot outrage porn is so much more satisfying? Big tech platforms essentially serve as an etheric parrot that collects information on our impulses and mirrors back the content we find tantalizing.

In the secular age, there is no central mediating authority or reservoir of meaning. Everyone is fundamentally a solipsist, the main character of their life’s movie. This epistemological fact has become elevated to a moral imperative and burnished through the rhetoric of identity. But in 2020, this solipsism expanded into the algorithmic architecture of all our social interactions.

The default in cultural and political discourse is the rhetoric of identity: who you are, which group you can profess to speak for, and your subjective experience. The problem is that we have not only begun to acknowledge our partiality, and the partiality of others, but have also begun to revere it, and this is a mistake.

In so doing, we ironically evince a fundamental post-Protestant moral puritanism — as Mark Lilla recently wrote, “The uptight Bible-thumping humbug of yore has been shamed off the public square — but only to make room for networks for self-righteous beautiful souls pronouncing sentence from the cathedra of their inner Vaticans.”

Discourse happens in our digital public squares, which have become increasingly irrational and seemingly disconnected from any earnest desire to engage in speech for the sake of the pursuit of truth. The cognitive disaster of a year and counting’s worth of lockdown is that it’s not clear how it’s even possible to disrupt our filter bubbles or widen our horizons. It’s as though navel-gazing were transposed onto digital fences that monopolize our line of vision.

So I spent much of this past year watching as people in my orbit demonstrated their consistent inability to peer out of their own digital channels. Sloppy diatribes, vicious personal attacks, rank partisanship. It is paradoxical that while the effect of social media has been to eliminate barriers to entry into public discourse, it has simultaneously whittled the Overton window down to a sliver, enforced by the digital mob.

In pre-pandemic times, of course, the potential for disruption existed by interfacing with physical reality — whether at the ‘water cooler’ or its analogs, or even just driving through parts of one’s city where the reality is at odds with your own. Since March 2020, our views of the world have been formed almost entirely through a digital filter.

In 21 Lessons for the 21st Century, Yuval Noah Harari points out the somewhat paradoxical reality that information overwhelm has made our fundamental irrationality more glaring: humans think in stories rather than facts, numbers, or equations. The simpler and more emotive the story, the more persuasive it is to the human brain: “Homo sapiens is a post-truth species, whose power depends on creating and believing fictions.”

How this looks on a daily basis on the timeline is that nuance, reasonable disagreement, and complexity become eclipsed by a mutual perception of one’s political opponents’ villainy. To my friends on the right, pro-lockdown and identitarian leftists are peddlers of dangerous authoritarianism, surreptitiously buttressing the Chinese Communist Party’s agenda not just by pushing for longer and stricter lockdowns but by dividing Western culture into a million fragments over debates on trans rights and cancel culture.

To my friends on the left, conservatives are outright complicit in or adjacent to white supremacy, “murderclowns” indifferent to the effects of the virus, and — at least in the US — directly responsible for the attacks of January 6th on the U.S. Capitol building.

This won’t end well. As the philosopher Sam Harris puts it:

“We have a choice. We have two options as human beings. We have a choice between conversation and war. That’s it: conversation and violence.”
