
‘Toxic innovation’: Ariel Ezrachi and Maurice E. Stucke on how Big Tech is stifling innovation and harming society


In this episode of Hub Dialogues, host Sean Speer is joined by law professors Ariel Ezrachi and Maurice E. Stucke to break down their book, How Big-Tech Barons Smash Innovation―and How to Strike Back. They discuss how the most successful Big Tech companies stifle innovation by delisting or swallowing up competitors, the harms this imposes on our broader culture, and why cities are the true sources of innovation in society.

You can listen to this episode of Hub Dialogues on Acast, Amazon, Apple, Google, Spotify, or YouTube. A transcript of the episode is available below.

Transcripts of our podcast episodes are not fully edited for grammar or spelling.

SEAN SPEER: Welcome to Hub Dialogues, I’m your host, Sean Speer, editor-at-large at The Hub. I’m honoured to be joined today by Maurice Stucke and Ariel Ezrachi, who are law professors at the University of Tennessee and Oxford, respectively, and co-authors of the fascinating new book, How Big-Tech Barons Smash Innovation―and How to Strike Back. In the book, they effectively argue that large tech companies have been a drag on the technology ecosystem, as well as overall innovation in the economy, and ought to, therefore, be subject to a dedicated policy response. I’m grateful to speak with them about their thesis and its economic and policy consequences. Maurice and Ariel, thank you so much for joining us at Hub Dialogues, and congratulations on the book.

ARIEL EZRACHI: Thank you. A pleasure to be here.

SEAN SPEER: Let’s start with a biographical question. This isn’t your first co-authored book with one another. How did the two of you come to work together?

ARIEL EZRACHI: Well, quite a few years ago, we met, I think, at one of the antitrust conferences, and Maurice and I share many views on the state of the economy and markets. As we were discussing these, we realized that there are some questions, yet to be explored, that both of us were very keen to pursue. This is how it started. After a few joint articles, we began in 2016 with a book called Virtual Competition, where we explored the online environment, followed that with Competition Overdose, and now the Big-Tech barons and innovation book.

SEAN SPEER: Let me ask a conceptual question for our listeners. Your book is about Silicon Valley and Big Tech, and their economic, social, and political consequences. In this story, who is Big Tech? What are its characteristics? And what determines if a company is in or out of your definition?

MAURICE E. STUCKE: Sure. So, one thing is that the primary actors in our book are in Silicon Valley. But one of the points that we make is that the digital barons aren’t necessarily restricted to just the U.S.; you can have these tech barons in other countries, such as China, as well. What got us started was the saying that apps are worth millions and platforms are worth billions. But on top of platforms are these entities that control ecosystems. Google’s CEO mentioned that: they said that they don’t create platforms, they create ecosystems. We wanted to understand, what’s the difference? When you own, let’s say, a popular platform such as TikTok, versus Facebook, which controls several interlinked platforms, what additional power does that give them? So in our story, you have the Big-Tech barons that control these ecosystems. We first define what’s so special about ecosystems, and why it gives them so much power. And then, are they the true innovators? We look at the Big-Tech barons, and then we come to the tech pirates, and we discuss the difference between the two.

SEAN SPEER: It’s increasingly noncontroversial that the rise of Big Tech has had profound social and political consequences. But it’s often assumed that these are the trade-offs of significant innovation and broader economic value. Yet you argue that this isn’t right either. That not only have we gotten these negative social and political consequences, but we also haven’t gotten much economic gain. Help our listeners understand this point. Why, in your view, does Big Tech represent a model of innovation that primarily extracts value rather than creates it?

ARIEL EZRACHI: So that’s really a key point that we make. When you look at innovation, you can classify it. You can look at sustaining innovation, where you try to improve the products and services that you offer your clients and consumers at the moment. And you can look at disruption, where you’re trying to do, or offer, new technologies, new processes that will benefit society. Generally speaking, there is an assumption that Big Tech companies offer us innovation across the board. Even more than that, there is an assumption that the ecosystems they build act as magnets to others and encourage further innovation. Once Maurice and I started looking into that (we began because we were commissioned by the European Commission to look into these issues, and later on we continued with our research), what we started to see was that whereas Big-Tech companies indeed invest heavily in sustaining innovation, and also invest in disruption outside their ecosystems, at the same time they make a very clear effort and engage in many strategies with the aim of crushing innovation that disrupts their ecosystems.

So alongside the benefits that they offer us, there is an increasing effort to block disruption that affects their profit-making machine, that affects their value chain. And as those ecosystems grow, as those digital empires get larger and larger, that incentive becomes more and more significant for us as a society. What you consider to be a relatively small platform could have an impact on all those innovators that wish to enter that platform. It can have an impact on us as consumers, in the way that we actually become aware of innovation. But it also affects funding; it affects the whole process. This is what we’re trying to point out: once you appreciate the link between market power and the impact on innovation, you start to see that we’re not actually getting the promised innovation. We’re getting much less.

MAURICE E. STUCKE: I just wanted to add one point, which we have in our book, to illustrate this: Facebook’s average revenue per user. In the fourth quarter of 2011, they made roughly $3.20 per user in the U.S. and Canada. That grew in the fourth quarter of 2020 to over $53. That’s a 16-fold gain. So one wonders, has your experience on Facebook improved 16 times? It’s highly unlikely. So who’s capturing that value? And is that value necessarily helping us, as Ariel mentioned, or is it more now just extracting value?

When you couple that with the Facebook Files published by The Wall Street Journal, you can see that the innovations aren’t necessarily helping us. They’re in fact becoming more addictive. They’re in fact hurting our well-being. This has tangible effects, particularly on teenage users on Instagram, and Facebook knows this. That’s a nice case in point: you’ve got the 16-times increase in revenue per user, and it’s not innovation that is creating value for the user. Instead, what we see is that it’s extracting value, and for the user, it can be destroying value.

SEAN SPEER: That’s a really helpful illustration, Maurice. Let me pick up the point about the place of these firms in their respective tech ecosystems and the role that they play in quashing innovation and disruption, as you characterize it. What are some of the ways in which the emergence of this small number of massive tech firms has come to harm innovation and block disruption? Can you give us some indication of how they go about doing that?

MAURICE E. STUCKE: Sure. What we focus on is that in creating the ecosystem, they then have access to unparalleled data. They have a weapon that earlier monopolies lacked. We call it the nowcasting radar. They can identify nascent competitive threats well before others can. One of Facebook’s executives called their nowcasting radar the gift that keeps on giving. It was, ironically, a privacy app that they used to help identify WhatsApp among the potential threats.

So once they identify a threat to their value chain, what we then chronicle is that they have multiple tools to affect both the demand for and supply of innovation. One of the ways that they can affect the demand for innovation is by steering us on a path away from innovations that could be potentially disruptive to ones that could be actually helpful. And Ariel came up with a really nice analogy that he can share, about the birds in England, that kind of encapsulates that.

ARIEL EZRACHI: We all sometimes have an idea of autonomy, and we believe that we benefit from the power to determine the path that we take online: that you determine where you’re heading, you determine what you click and what you purchase or what you read. Whereas in fact, once you appreciate the ability to manipulate us, you realize that many times you will walk on a path that was prepared for you by the platform, one that leads you in a very specific direction. You walk on that path with a sense of autonomy, without being aware of all the mechanisms behind the scenes that are being deployed to do that. It’s interesting that there is this biologist who told us an amazing story about how birds that used to migrate to Africa started migrating to the U.K. in the winter, although it didn’t make any sense. And the reason was that in the U.K., people started feeding them. This is a small anecdote but a beautiful one, because you realize that in many ways, we are no different than those birds. Whereas naturally you might have gone to a certain platform or might have used a certain payment system, once someone puts enough food for you elsewhere, once someone is able to manipulate the way that you see reality, we just migrate elsewhere.

You see that, as I mentioned, with payment systems, with the payment systems that we use for in-app purchases; you see it when it comes to technologies that we use; you see it when it comes to privacy applications that you might download. All of these are not really determined by an ideal open market where we get the best. They are determined by our understanding of what is available. They are determined by the friction that is created between you and those technologies. For example, I don’t know how many of our listeners have tried to sideload an application. It’s actually quite difficult to sideload an application. In theory, if an application thrown out of an app store is a great application, you would expect it to succeed regardless. In reality, creating friction means that we don’t really go there, and we deprive new technologies of scale. This is how the type of innovation that we get is very much determined by those who control the ecosystems. The same would go for search engines. Almost any platform that you use, any gate to a market, is not just a gate. Someone is orchestrating the environment. And in doing that, that company controls the future of innovation.

SEAN SPEER: We’ll come back to some of the explanations for why we haven’t seen policy and regulatory progress, as well as some of your recommendations for what policymakers in the U.S. and elsewhere ought to do. But let me just ask two questions before we get there. The first picks up something you just discussed: a consequence of the influence that Big Tech has had, and is having, on the tech ecosystem and the economy as a whole is what you describe as “toxic innovations.” Why don’t you unpack what that means and how these toxic innovations are influencing our economy, society, and politics?

MAURICE E. STUCKE: It’s only natural that if you control an ecosystem, you really have a couple of different incentives. First, for any innovation that happens within your ecosystem, you want it to be sustaining, to help reinforce your ecosystem. So it’s sort of building the coral reef, but it’s the coral reef that already exists. A good example is Google investing now in the next generation of search. It’s going to go from the keyboard to voice, but it’s not going to change the fundamental value chain. Google will still make a large portion of its profits from search advertising, whether it’s voice, or whether you’re using your desktop or your phone. The innovation will happen, but it will just reinforce the tech barons’ power. Innovation outside of their ecosystem can be disruptive. And there, you could see the tech barons investing in, let’s say, health care. But what’s going to happen? It’s going to be disruptive, but only to the extent that it doesn’t jeopardize the power within their ecosystem. As they become larger and more powerful, there are going to be fewer opportunities for others to come up with truly disruptive innovations that add value. What you’re gonna have is innovation, but it’s going to be more tightly controlled. It’s going to help reinforce the Big-Tech barons’ power.

ARIEL EZRACHI: And maybe to add to that, I think it’s interesting that our policymakers and politicians often take a quantitative approach, so they are impressed when they see the numbers. And indeed, those are impressive numbers. When you see the amounts invested by these Big Tech companies, they often invest in a wide range of technologies. But once you move beyond the quantitative aspect and look at the qualitative dimension, and you ask yourself, “Where does this money go? What is it that they invest in?” you start to realize that because of market power, those companies don’t necessarily invest in the innovation that we crave, that we as consumers and citizens want; they invest in innovation that serves their needs.

Something happens with market power: there is a detachment between the market, us, and reality. So the innovation, the investment that we see at the moment, is driven by the value chain. It is driven by the need to profile us, the need to target us, the need to satisfy the advertising machine. A lot of money is poured in those directions. But if you ask consumers, if you ask citizens, when they see the overall picture, do they benefit from that innovation? Ask yourself, when you look at our democracy, do we benefit from targeting? Do we benefit from the fact that negativity is rewarded online? No, we don’t. And yet, at the quantitative level, some policymakers are still impressed. Some policymakers are still really taken by this ideological platter that they are served day and night by the Big-Tech companies. They are convinced that the future is there, and that any regulation will signal the end of whatever it is that they want to have. Whereas the reality is, once you look at the quality of innovation, once you look at what we get and the impact on society, we’re losing. We’re not gaining much.

SEAN SPEER: Okay, that’s a pretty devastating answer. But while Big Tech probably doesn’t have a lot of defenders, one argument that you often hear is that if policymakers broke up these companies, we might end up with Chinese firms dominating these technologies and strategic areas, including software and the internet. That is to say, Western firms need scale to compete with their state-adjacent competitors. So while Google and Facebook may not be great, they’re better than their Chinese alternatives. How would you respond to that argument?

MAURICE E. STUCKE: When we talk about toxic innovation, we’re talking about innovation that can either extract or destroy value. We look at behavioural advertising, and that’s one area, but the tools of behavioural advertising are now being exported elsewhere, including into politics, elections, and surveillance generally. When Ariel and I would go to conferences, there would invariably be someone from Google who would say, you know, I like behavioural advertising because I don’t want to get porn ads. It was always that choice: you can get porn ads, or you get behavioural advertising. That was the alternative. And no one really wants to get porn ads. You wonder, wow, if we didn’t have behavioural advertising, we would just be inundated with porn ads. In fact, the Big-Tech barons are also squeezing out the pornography industry, but putting that aside, behavioural advertising’s pitch to us is that it gives us more relevant ads. That seems actually benign. But that’s not what’s going on. Because that’s not where the profits are.

The profits are in going from predicting behaviour to manipulating behaviour. And it’s not just manipulating behaviour; it’s predicting emotions and manipulating emotions. What Ariel and I did was go through some of the innovations by these Big-Tech barons, and what they’re focusing on is really a new frontier: understanding why you cry. Because if you understand why that is happening, what may have been prompting it, then they can manipulate those emotions to get the response they want. It’s just to get you to buy things that you otherwise wouldn’t have wanted, at the highest price that you’re willing to pay.

The same tools are now being used to manipulate other types of behaviour, like whom you vote for. Cambridge Analytica was basically using some of the same technologies that are available in the behavioural advertising world to manipulate voter behaviour, to get people perhaps to vote for a particular candidate, or, if they can’t be flipped, to persuade them not to vote. That has a real impact on our democracy. That’s just one layer. The other layer is that the tech platforms are driven by sustaining our attention in order to manipulate behaviour. To sustain our attention, they have to find the right measure of emotions to keep us attuned. What the political parties are doing now, in order to stay relevant, is catering their messages to the Big-Tech barons’ algorithms. With Facebook, for example, as its algorithm favours more negative news stories, the political parties have become more negative. And this is not just us speculating; this comes from internal studies by Facebook, where they’re seeing the impact that their algorithm is having on political parties around the world.

ARIEL EZRACHI: The argument takes an all-or-nothing approach: either we break everything and go to a non-existing economy, or we have to keep things as they are. This is, of course, a straw man to convince us that we must accept that whatever the position is at the moment, this is the right position. But that’s not the case. Before we even approach the question of whether we should, or ought to, break up some of those Big-Tech companies, we should just appreciate that there is a scale of remedies that are possible. Some of those remedies are relatively mild. Of course, they might affect profitability, but we balance that against the ripple effects that those companies are generating. The starting point is to appreciate the effects that those companies are creating, or externalizing, on us, on society, on individuals. Once you appreciate that, we look at the levers that are used to create these effects. And once you appreciate those levers, you can then start to formulate some ideas on what the right policy measures or enforcement measures are.

It could be, for example, informing users of when they are trapped. Rather than the existing stealth position where we’re unaware of half of what is happening, inform us. It could be better control over data. It could be about privacy. It could be about opening markets. It could be about preventing companies from delisting disruptors. So there is such a vast toolbox that can be used. And it is just that when you have those discussions, sometimes those tech companies ignore those toolboxes, and they go to the most extreme measure because that’s an easy argument to make.

Now, that extreme measure might be necessary at times when the toolbox is ineffective. So it is possible that you will have a company where you will decide that that company is too big, or too problematic for society, for politics. That argument might not necessarily come out of an antitrust debate, where we are very concerned about chilling competition and chilling innovation; it might come from a political perspective. It might come from the fear that governments and societies have of private power that can distort our democratic system. Whether you favour it or not is a rather complex question. But I think what is obvious is that, if that happens, it will be in a very select number of cases, dealing with very specific functions. It’s certainly not going to be something that will have an impact in the way that some companies are trying to present it.

Even more than that, it is actually quite important to appreciate that when you look at the income of those companies and the taxes they pay, you realize that whereas they present themselves as the saviours of the U.S. economy, they’re not channeling that much income as taxpayers. Amazon aside, in terms of employing so many employees across the U.S., the others are often based elsewhere and the income is earned elsewhere. So the U.S. economy doesn’t benefit to the extent presented in the press. I think it’s very important to try to depart from those slogans and move into a more nuanced debate. And in that nuanced debate, personally, I would argue that there is a massive toolbox that can be utilized to retain all the benefits that we gain from Big Tech companies, while protecting us, and while making sure that society is getting the benefits but is not paying the price for what I would refer to as greed.

SEAN SPEER: You talked a bit there about the policy toolkit. The book refers to three focal points as a conceptual framework for thinking about reform. Do you want to just unpack what those focal points are for our listeners?

MAURICE E. STUCKE: Sure. What we talk about is “V,” “I,” “D.” It’s first looking at the value of innovation. We normally assume that if there’s a market demand for an innovation, it must have positive value. But that’s not the case. There could be, for example, cyberstalking technology that might provide value for some people but doesn’t provide value for many. So first you have to look at the value of innovation. And it’s sort of tricky for a regulator to say, “Well, how much value will this innovation provide us?”

The first concept is related to the second, which is incentives. The problem is that we often look at the behaviour of the Big-Tech barons, but we don’t look at the underlying incentives. Why do they behave that way? The Big-Tech barons are not necessarily evil people. Many of the people there have good reasons to work there. Most people want to contribute to the greater good, I would imagine. And some of Facebook’s internal surveys show that there’s alarm within the company about senior management and what they’re doing. But the incentives of the ecosystem reward that behaviour. Facebook, for example, could offer algorithms that would counter the negative aspects of its primary algorithm. But so long as the primary algorithm is creating the money, that’s what they’re going to use. If you break up Facebook, you’re gonna then have TikTok, which is going to have similar incentives. So you’ve got to look at incentives.

Then the final aspect is diversity. It’s always a bad bet to rely on a few powerful companies for innovation. That has never been the model of the United States or most other countries. Historically, we relied on many startups, and some of those startups will fail. But those that succeed can be quite disruptive and create significant value. So using those three principles, we then turn to how we can promote innovation.

Some of the things that we talk about are what you have already seen today, like Europe’s Digital Markets Act. That’s absolutely a step forward, as our book discusses, but then it’s like shooting ducks: you can’t shoot where the duck is, you’ve got to shoot where the duck is going. The problem with Europe’s proposals is that they’re all responding largely to the past behaviour of these tech barons. And the tech barons aren’t necessarily going to rely on that. So you’re going to need new tools as well. But beyond strategies to deter that behaviour, what are some ways that you can then encourage new innovation? We talk about investment in basic research. The most surprising thing in the book for us was investing in cities. That is sort of the unheralded generator of innovation. Ariel and I are really excited about cities. So, Ariel, why don’t you talk a little bit about cities and the importance that they can have for innovation?

ARIEL EZRACHI: Yeah, that’s a really nice point. When we think about the way forward, we think about different types of policies. One way would be optimizing policies, where you try to improve your antitrust laws, your privacy laws. You try to really take the three values that Maurice mentioned and deploy them in a way that makes our policies wiser and smarter. The other area is support policies, where we try to direct our attention, our funding, our active steps in the right way. It’s very interesting, once you look at cities, to realize how cities grow in a way that is much more conducive to innovation than a firm. When a city doubles in size, you see a real change in the innovation that is produced by the city, whereas when a company grows, you don’t see that.

We as a society have become accustomed to viewing some of the giant companies as the source of innovation. And by the way, a lot of research highlights how those companies might be the source of some innovation, but will often just purchase other companies that are the real disruptors. Once you move away from that vision to understand the role of cities, you realize that cities are, in fact, the engine of innovation. There are a few examples that we discuss in the book of cities that grow in size, and as they grow, because of the infrastructure that is created, they become hubs of innovation.

We try to use that example of cities to help our readers and policymakers shift the focal point and appreciate that the engine of future innovation is not necessarily the Big Tech companies but those who create infrastructure that brings together people and brings together other companies. Then you can really unlock future disruption and future innovation, starting from the idea of smart cities that don’t have to focus on a single provider, a single platform company that would basically reduce the diversity of innovation we get, and working through the actual investment of cities in innovation. Maurice, do you want to add on cities?

MAURICE E. STUCKE: Yeah, one thing that we also found with cities is, first of all, they’re less likely to be captured, because cities have an incentive to promote diversity. And in fact, the business literature points out that regional economic clusters that are more diverse are actually more durable, and they provide more innovation. We don’t know exactly why that’s the case. And there are a lot of things here we don’t know exactly why. Like, why is it that when you double the size of a city, you get not just 100 percent more patents, but 115 percent more patents, whereas when you double the size of a company, you’re gonna get less than 100 percent more patents? But one of the things that the literature points to is that there are these sort of positive externalities with these regional innovation clusters. They’re involved in basic research from the universities that help support them. You then get a trained workforce, and you have the mobility of the workforce to move around. And then you also have cities with different areas of expertise and the clash of ideas, where, you know, an artist might collide with a playwright, and then they’ll collaborate. That’s the sort of innovation that you can have as well. And it’s really the unsung hero.

Ariel and I have been in this game now for over 25 years. We’ve been to innumerable conferences. Everyone agrees that innovation is the most important thing for an economy. There’s no dispute about that. But we never heard anything about cities. It was Geoffrey West who wrote this book, Scale, where he came out with these findings. We started poking around and we found business literature that supported it as well. But it was a complete disconnect. No one had ever mentioned it before.

SEAN SPEER: That’s fascinating. Your description of the playwright and the artist reminds me of Matt Ridley’s line about how innovation is ultimately about ideas having sex, which requires that kind of collaboration and combustion that you talk about.

Let me ask a final question. The two of you have been so generous with your time. If listeners are persuaded by your arguments, both about the need to reconceptualize the role of Big Tech in our economies and society on the one hand, and to pursue a different approach to innovation, one that might be described as city-centric, on the other, the question is: why aren’t policymakers moving in these directions? And what do you think the prospects are for reform along these lines in the U.S., Canada, and elsewhere?

ARIEL EZRACHI: I think that when you look around, you realize that policymakers are changing the way that they view markets; there’s no doubt about it. But you also realize that there are still pockets, when it comes to politics or policymaking, of people, pressure groups, or lobbyists that are investing a lot of effort and a lot of money in trying to sustain the ideological platter, and to sustain the idea that any move against Big Tech would really devastate the economy, or would have other devastating effects on society or politics. But we do see a difference. We’ve already seen a shift when it comes to antitrust enforcement. There, I think the U.S. radically changed its position, especially if you compare it to where we are in Europe: the U.S. has now aligned with Europe and even gone further with some proposals.

But one area where we are yet to see that change, I think, is when it comes to innovation. Because when it comes to innovation, there is still this underlying assumption that everything is deliberate. There is still this quantitative assumption that if a company invested billions in research and development, that must be good, without appreciating that there is an element here of quality of innovation, an element of value chains, an element of plurality. This is likely to be the next frontier. It might not be easy. It might take a while. It also took a while, by the way, when Maurice and I released Virtual Competition. We had, I would say, a lot of instances where we would present our ideas about the need to change the way antitrust is enforced in the digital economy, and it wasn’t immediately apparent to some policymakers. It was the work of many think tanks, international organizations, other authors, and us that led to that change. And it is likely to be the case that once people appreciate the more nuanced approach to innovation, and also appreciate the devastating impact we have at the moment when we allow a handful of companies to determine the future paths of innovation, we will also see a change on that front.

MAURICE E. STUCKE: I’d just like to take up the prospect of reform. In Europe, it’s seemingly brighter because they’re doing things: they’re enacting the Digital Markets Act and the Digital Services Act, and they’re also proposing the Data Act. In the United States, we have several bills to help rein in these Big-Tech barons, and they haven’t yet even received a vote in either the full Senate or the House, despite John Oliver doing an episode about this. It can really seem bleak in the United States. But my analogy here would be the Berlin Wall. It was a fact of life. It was almost accepted for decades that that was just that, until one day it wasn’t. The Berlin Wall didn’t fall down because some policymakers got together. It was really, ultimately, the people. It was the uprising of the people to demand change. And I think here in the United States and Canada, you have the interest of these members of Congress, but they’re just powerful. We don’t have to accept this current status quo.

We can demand change and we should demand change. It will only come about by telling policymakers, “Why aren’t we enacting further proposals to give us the innovation that’s actually going to help us, that’s not going to undermine our children’s well-being, that’s not going to give them thoughts of suicide, that’s not going to create a Metaverse where all of the horrors that you see already on Instagram are now going to be depicted even more horrendously in this virtual reality?” That’s not a reality that we have to accept, but it’s only if we demand change.

SEAN SPEER: Well, for those who would like a picture of a different reality, they ought to read How Big-Tech Barons Smash Innovation―and How to Strike Back. Maurice Stucke and Ariel Ezrachi, thank you for joining us at Hub Dialogues.

