Viewpoint

Greg Boland: Regulation is urgent and necessary in the digital media age

If the novel was the reefer of a previous age, social media is now the opioid
Workers help set up the Google booth before CES International on Jan. 4, 2020 in Las Vegas. John Locher/AP Photo

Janet Bufton makes a compelling argument for personal choice in her recent essay at The Hub, urging governments to show a little faith in people before regulating social media.

However, in situations where the odds are so stacked against the individual, regulation is both normal and required.

Make no mistake, the government’s Bill C-10 is a misguided, paternalistic attempt to shape and control media consumption in Canada. Force-feeding Canadians three healthy servings of the Shania Twain catalogue (notwithstanding that she decamped to a tax haven years ago) serves no purpose but to enrich an established artist. But behind these measures is a more sinister motivation: to let the government control and shape what the population reads, hears, and believes.


But don’t let that argument distract us from the real problems we are facing in the digital media age. Some form of regulation is urgent and necessary.

If the novel was the reefer of a previous media age, social media is the opioid of ours: addictive, exponentially more powerful, and impossible to contain. Novels, as Janet Bufton points out, were concerning in their day. However, they were largely works by a single author, and they propagated relatively slowly. Slowly enough for society to absorb, adapt to, and assimilate the information they contained. Slowly enough for that information to be self-corrected by debate, critique, fact-checking, and discussion.

In contrast, digital media is authored by highly concentrated corporations which are staffed by trained behavioural scientists. The content need not be literally authored by the companies in question. Rather, assembling, curating, editing, censoring, and targeting the information has the same effect.

Skilled entities can manufacture and propagate information faster than society can correct it. The result: swung elections, bankrupted companies, and ruined individuals.

Content is amplified through algorithms powered by detailed surveillance of our personal habits. Children growing up in today’s world will be almost completely profiled by the time they are adults. They will have no chance against artificial intelligence designing content to be manipulative and virulent.

In addition, these platforms are effectively monopolies unlike any other in history. Society, for all intents and purposes, consumes all its information through them.

Of course, censoring the media is dangerous and counterproductive to society, so what are we to do?

We can at least ameliorate the problem in three ways.

Accountability

Currently social media companies are afforded blanket protections from liability for anything posted by third parties. In contrast, “traditional” media outlets are held accountable for their publications.

Newspapers can be found liable for even editing a story unfairly. The same doesn’t hold for digital platforms. They can assemble and disseminate misleading information without any accountability. Moreover, an entire industry has grown up to exploit this loophole. You can anonymously publish literally anything about a person, propagate it instantly, and never be forced to correct it or take it down. Legal recourse is extremely difficult for the victims, who must see the content at the top of their Google search results for eternity.

Teams of data scientists create fake information and reverse engineer Google’s algorithms and social media platforms for maximum effect.

Regulation can solve this problem by either making the platforms accountable, or eliminating their ability to curate information to target an individual. The original protections that social media companies enjoy never envisioned nefarious actors assembling anti-vaccination information and feeding it only to those scientifically determined to be susceptible.

Finally, accountability would also give recourse to those who have been deplatformed.

Competitive regulation

Part of the problem is the lack of alternatives. For most of society, Facebook and Google are simply reality now.

There are well-documented allegations of anti-competitive behaviour that need to be corrected to allow alternative information sources to flourish and provide a counterbalance. As argued in one U.S. lawsuit, “through its campaign of anticompetitive conduct, Google has achieved and maintained a monopoly or near-monopoly in [the] marketplace by erecting a toll bridge between publishers and advertisers and charging an unlawfully high price for passage.” Behind that economic barrier sits a high hurdle for alternative sources of information.

However, more concerning is the monopoly’s effect on the individual. We are given the Hobson’s choice of either selling our digital organs for access to the monopoly or living in a pre-digital world. We can force our children to be social outcasts or expose them to pernicious material. Even a platform popular with children, such as the live game-streaming site Twitch, now has a “Just Chatting” category featuring partially clothed girls dancing in a hot tub in exchange for viewer donations (which scroll alongside the video like a ticker tape).

Empower the individual

Allowing individuals to control their own data would go a long way toward containing the worst of the problem. Starbucks cannot snoop on your internet traffic while you use its WiFi. But Google can record and save for eternity: every search request, every web page visited, your location 24 hours a day, the contents of your emails, all the content of your Google Docs, your Android text messages, and all your activity through “Sign in With Google,” including purchases at online retailers.

By the time today’s child is an adult, her entire life will have been surveilled to the tiniest detail, starting in Google Classroom. If the AI algorithms determine she is science-skeptical, she will be fed a diet rich in anti-vaccination propaganda and conspiracy theories. Her search queries will be massaged and manipulated, as will her news feed and her advertisements. Her reality will be shaped to appeal to her vulnerabilities. This is not the low-quality content of a novel.

Allowing an individual to own and control their information would solve this issue. Opting out of surveillance would force at least some form of neutrality. “Filter bubbles” would be much less prevalent. 

The obvious counterargument is that we won’t get all these services if the platforms can’t retain their current profit model. However, a simple examination of the digital rent extracted from users shows that search could still be provided at a fraction of the profitability currently enjoyed. Moreover, the marginal cost of providing these services is a fraction of what it was at the inception of the model.

Bill C-10 is designed to take power from the platforms and hand it to the government so that it can control and shape our lives. The right solution is to hold the platforms accountable, remove their monopoly power, and empower the individual.
