Taylor Owen speaks at the third annual Simces and Rabkin Family Dialogue on Human Rights, Nov. 9. (screenshot)
Canada, like most of the world, is behind in addressing the issue of hate and violence-inciting content online. As the federal government moves to confront this challenge with a new bill in this session of Parliament, it will face conundrums around where individual freedom of expression ends and where the right of individuals and groups to be free from hateful and threatening content begins.
The ethical riddles presented by the topic were the subject of the third annual Simces and Rabkin Family Dialogue on Human Rights, Nov. 9, in an event titled Is Facebook a Threat to Democracy? A Conversation about Rights in the Digital Age.
The annual dialogue was created by Jewish Vancouverites Zena Simces and her husband Dr. Simon Rabkin. It was presented virtually for the second year in a row, in partnership with the Canadian Museum for Human Rights.
The featured presenter was Taylor Owen, who is the Beaverbrook Chair in Media, Ethics and Communications, the founding director of the Centre for Media, Technology and Democracy, and an associate professor in the Max Bell School of Public Policy at McGill University. He presented in conversation with Jessica Johnson, editor-in-chief of The Walrus magazine.
The advent of the internet was seen as a means to wrest control of a society’s narrative from established media, governments and other centralized powers and disperse it into the hands of anyone with access to a computer and the web. Instead, as the technology has matured, online power has been “re-concentrating” into a small number of platforms like Facebook, Twitter and YouTube, which now have more global reach and cultural power than any entity that came before them.
“Understanding them and how they work, how they function, what their incentives are, what their benefits are, what their risks are, is really important to democratic society,” said Owen.
These are platforms that make money by selling ads, so it is in their interest to keep the largest number of people on the platform for the longest time possible, all while collecting data about users’ behaviours and interests, Owen said. These incentives favour content that is among the most divisive and extreme and, therefore, most likely to draw and keep audiences engaged.
The sheer volume of posts – in every language on earth – almost defies policing, he said. For example, in response to public and governmental demands that the company address proliferating hate content and other problematic materials, Facebook has increased resources aimed at moderating what people post. However, he said, 90% of the resources dedicated to content moderation on Facebook are focused on the United States, even though 90% of Facebook users are in countries outside of the United States.
A serious problem is that every country sets its own limits on speech, while social media, for the most part, knows no borders.
Canada has a long history of speech laws, and Parliament is set to consider a controversial new bill intended to address some of the dangers discussed in the dialogue. But, just as the issues confounded easy answers in the discussion between Owen and Johnson, attempts to codify solutions into law will undoubtedly result in fundamental disagreements over the balancing of various rights.
“Unlike in some countries, hate speech is illegal here,” said Owen. “We have a process for adjudicating and deciding what is hate speech and holding people who spread it liable.”
The United States, on the other hand, has a far more libertarian approach to free expression.
Germany offers an example of a country attempting to find a middle path, he said, but its approach is likely to have unintended consequences. Germany has decreed, and Owen thinks Canada is likely to emulate, a regime in which social media companies are liable for speech that is already illegal: terrorist content, content that incites violence, child exploitative content and the nonconsensual sharing of images.
Beyond these overtly illegal categories is a spectrum of subjectively inappropriate content. A single media platform trying to accommodate different national criteria for acceptability faces a juggling act.
“The United States, for example, prioritizes free speech,” he said. “Germany, clearly, and for understandable historical reasons, prioritizes the right to not be harmed by speech, therefore, this takedown regime. Canada kind of sits in the middle. Our Charter [of Rights and Freedoms] protects both. The concern is that by leaning into this takedown regime model, like Germany, you lead platforms down a path of over-censoring.”
If Facebook or YouTube faces fines as high as, say, five percent of its global revenue for failing to remove illegal speech within 24 hours, its incentive is to massively over-censor, he said.
Owen said this will affect the bottom line of these companies, just as mandatory seatbelts in cars, legislation to prevent petrochemical companies from polluting waterways and approval regimes governing the pharmaceutical industry added costs to those sectors. Unfortunately, the nuances of free speech and the complexities of legislating it across international boundaries make this a burden that will probably require vast resources to oversee.
“It’s not like banning smoking … where you either ban it or you allow it and you solve the problem,” said Owen. There are potentially billions of morally ambiguous statements posted online. Who is to adjudicate, even if it is feasible to referee that kind of volume?
Rabkin opened the dialogue, explaining what he and Simces envisioned with the series.
“Our aim is to enhance the understanding and create an opportunity for dialogue on critical human rights issues, with the hope of generating positive actions,” he said.
This year’s presentation, he said, lies at a crucial intersection of competing rights.
“Do we, as a society, through our government, curtail freedom of expression, recognizing that some of today’s unsubstantiated ideas may be tomorrow’s accepted concepts?” he asked. “Unregulated freedom of speech, however, may lead to the promulgation of hate towards vulnerable elements and components of our society, especially our children. Do we constrain surveillance capitalism or do we constrain the capture of our personal data for commercial purposes? Do we allow big tech platforms such as Facebook to regulate themselves and, in so doing, does this threaten our democratic societies? If or when we regulate big tech platforms, who is to do it? And what will be the criteria? And what should be the penalties for violation of the legislation?”
Speaking at the conclusion of the event, Simces acknowledged the difficulty of balancing online harms and safeguarding freedom of expression.
“The issue is, how do we mitigate harm and maximize benefits?” she asked. “While there is no silver bullet, we do need to focus on how technology platforms themselves are structured. Facebook and other platforms often put profits ahead of the safety of people and the public good.… There is a growing recognition that big tech cannot be left to monitor itself.”
The full program can be viewed at humanrights.ca/is-facebook-a-threat-to-democracy.