
Online Hate

Standing Committee on Justice and Human Rights
feat. Heidi Tworek
May 30, 2019




Opening Statement

Remarks by Dr. Heidi Tworek, Assistant Professor of International History at the University of British Columbia, Visiting Fellow at the Joint Center for History and Economics at Harvard University, Non-Resident Fellow at the German Marshall Fund of the United States and the Canadian Global Affairs Institute.

Audio of the full morning’s hearing is available here.

Thank you, Mr. Chair, and thank you to the committee for the invitation to appear before you today. It is disturbing that we live in a world where hate speech online is rising, where what Whitney Phillips has called the “oxygen of amplification” has elevated extremist views, and where online hate has several times inspired horrific offline violence. I welcome the committee’s careful consideration of how Canada can address these troubling developments.

I have examined both the Canadian and international aspects of this pressing question. Today, I will briefly describe the range of options on the table in other democracies, then turn to considerations around reinstating section 13 of the Canadian Human Rights Act, and finally suggest how to mitigate harmful speech, a broader, non-legal category.

Let me first state the sobering fact that hate speech is not a problem that can be “solved.” It will remain a continual threat. Still, levels of hate speech can ebb and flow. Certain circumstances and online ecosystems, as well as political, economic, and cultural factors, can facilitate more hate speech and more hate-related crime; other conditions can dampen both.

This is an international problem. Democracies around the world are exploring or implementing different approaches. As I detail in my brief, these range from legal measures to co-regulation to codes of conduct. In Germany, the NetzDG law requires social media companies to enforce 22 existing provisions of German speech law online. The UK has suggested a “duty of care” framework. France has proposed that a regulator require “accountability by design” from large social media companies.

The German NetzDG example is particularly instructive for the potential reinstatement of section 13. Passed in 2017 and in force since January 2018, the Netzwerkdurchsetzungsgesetz is often misunderstood. It did not create new categories of speech law in Germany. Rather, as the literal translation of its name shows, it is a “Network Enforcement Law”: it requires social media companies to act on manifestly illegal posts flagged under NetzDG complaints within 24 hours or face fines of up to 50 million euros for systematic failures to comply.

The problem in the German case was not existing law, but getting social media companies to enforce it. In other words, when considering reinstating section 13, we must also consider its enforceability. Given the scale of complaints, would this effectively privatize the enforcement of Canadian law? Would enforcement be national or global? There are further considerations, such as appeals processes. Alongside, or even in lieu of, reinstatement, it is important to consider how to prevent a massive backlog and how to deal with potentially hundreds of thousands of complaints (for context, Twitter and YouTube each received over 200,000 complaints under NetzDG in a six-month period).

It is also valuable to move beyond individual pieces of content. How, for example, do we deal with ecosystems and infrastructures that enable the financing of extremist speech? One member of the Canadian far right, for instance, tried to use GoFundMe to raise money to appeal a libel ruling against him. That ruling, written by Ontario Superior Court Justice Jane Ferguson, called his words “hate speech at its worst.” Only after complaints was the campaign removed. Such fundraising by extremist figures violates platforms’ terms of service yet occurs frequently; the campaigns are often removed only after civil society alerts the platforms.

This example illustrates that law is a vital part of these discussions, but we should remember that it covers only a small part of harmful speech online.

Much harmful speech is legal but undermines free, fair, and full democratic discourse online. Let me end with three suggestions for how Canada might approach harmful speech without infringing on our democratic right to free expression.

First, Chris Tenove, Fenwick McKelvey, and I have suggested creating a social media council that would mandate regular meetings between social media companies and civil society, particularly marginalized groups that are disproportionately affected by hate and harmful speech online. A social media council could be grounded explicitly in the framework of human rights. The idea of social media councils is supported by others, including the UN Special Rapporteur on Freedom of Opinion and Expression. By linking to international human rights, Canada could design institutions that enable us to regulate social media, and potentially impose national laws or principles, without inadvertently providing justifications for illiberal regimes to censor speech in ways that deny basic human rights.

Second, we can consider what transparency to mandate from social media and online companies. There is much we do not know and much that we cannot currently even investigate. For example, we could require audits to determine whether social media have discriminatory impacts that violate rights. These audits would go beyond requirements to act on illegal speech, and the monitoring of compliance with those requirements. Often the platforms themselves do not even know whether their algorithms are facilitating rights abuses, such as landlords using proxy terms to discriminate against prospective tenants. Or we could require audits to determine whether sites expose different groups to much greater volumes or severity of abuse. This would be akin to algorithmic impact assessments, which Canada is a world leader in requiring.

Third, we can support civil society organizations and academics that help those under attack and that research these topics. Hate speech is not solely an issue to be addressed by governments and platforms. Many social media platforms remove content, or reduce users’ exposure to it, only after journalists or civil society organizations have alerted the company to the problem. Beyond indicating something problematic about the online ecosystem, which will be hard to change, this implies that such organizations play an important role in combating hate speech. We can also further support research on how to encourage constructive engagement online.

There is much to be done on all sides. Thank you to this committee for inviting me to be part of the conversation.

