Failure to Balance Freedom of Expression and Protection from Online Harms: My Submission to the Government’s Consultation on Addressing Harmful Content Online

The government’s consultation on its proposed approach to address harmful content online concluded over the weekend. It was one of several consultations that ran during the election period, raising questions about whether policy makers are genuinely interested in incorporating feedback from Canadians. I submitted to all of them and will be posting those submissions this week.

I start with my online harms submission. The full submission, which touches on issues such as 24-hour takedowns, website blocking, proactive monitoring, and enforcement, can be found here. To learn more about the issues, catch my Law Bytes podcast episode with Cynthia Khoo or listen to a terrific discussion I had with Daphne Keller on the Tech Policy Press podcast. The submission opens with eight general comments, which I’ve posted below:

1.    The proposed approach does not strike an appropriate balance between addressing online harms and safeguarding freedom of expression. Indeed, after a single perfunctory statement on the benefits of Online Communications Services (OCSs) that says little about the benefits of freedom of expression, the document does not include a single mention of the Charter of Rights and Freedoms or net neutrality. There is surely a need to address online harms, but doing so must be Charter compliant and consistent with Canadian values of freedom of expression. I believe the proposed approach fails to adequately account for the freedom of expression side of the ledger.

2.    Rather than adopting a “made in Canada” approach consistent with Canadian values, the plan relies heavily on policy developments elsewhere. Yet the reality is that those models from countries such as France, Germany, and Australia have met with strong opposition and raised serious concerns about unintended consequences. Indeed, France’s approach has been ruled unconstitutional, Germany’s model has resulted in over-broad removal of lawful content and a lack of due process, and Australia’s framework is entirely unproven. An evidence-based approach would better account for these experiences rather than seek to mirror them.

3.    The proposed approach mistakenly treats a series of harms – spreading hateful content, propaganda, violence, sexual exploitation of children, and non-consensual distribution of intimate images – as equivalent and requiring the same legislative and regulatory response. While there is a commonality between these harms as so-called “illegal speech”, there are also significant differences. For example, it makes no sense to treat online hate as the equivalent of child pornography. Prescribing the same approach for all of these forms of content calls the efficacy of the policy into question.

4.    There are lingering concerns about scope-creep with this proposal. Government officials have previously referenced the need to address “harmful” or “hurtful” comments, raising the prospect of expanding the model far beyond the current five forms of illegal speech cited in the proposal. Moreover, the government has indicated that these rules would apply only to OCSs, identifying Facebook, YouTube, TikTok, Instagram, and Twitter as examples. It notes that there will be an exception for private communications and telecommunications providers such as wireless companies, Skype, and WhatsApp (along with products and services such as TripAdvisor that are not OCSs). Yet during a briefing with stakeholders, officials were asked why the law should not be extended to private communications on platforms as well, given that these harms may also occur on private messaging services. Since the government previously provided assurances that user generated content would be excluded from Bill C-10, only to backtrack and make it subject to CRTC regulation, there is a need for renewed assurances about the scope of the rules.

5.    The proposed approach envisions a massive new bureaucratic super-structure to oversee online harms and Internet-based services. Due process concerns dictate that there be a suitable administrative structure to address these issues. However, some of the proposed models are ill-conceived: they will neither scale well nor afford the much-needed due process. For example, adjudicating potentially tens of thousands of content cases is unworkable, requiring massive resources and raising real questions about appropriate oversight. Similarly, the powers associated with investigations are enormously problematic, with serious implications for freedom of the press and freedom of expression.

6.    The proposed approach threatens Canada’s important role as a model for the rest of the world. Some of the proposals risk being deployed by autocratic countries to suppress freedom of expression, with Canada cited as an example for why such measures are reasonable. The government should be asking a simple question with respect to many of its proposals: would Canadians be comfortable with the same measures being implemented in countries such as China, Saudi Arabia, or Iran? If the answer is no (as I argue it should be), the government should think twice before risking its reputation as a leader in freedom of expression.

7.    The proposed approach also threatens to harm the very groups it purports to protect. Without full due process and with clear incentives to remove content, there are real fears that the rules will be used to target BIPOC communities and vulnerable groups. Those groups could be silenced by a process weaponized by purveyors of hate, their voices removed under poorly conceived rules that lack adequate due process.

8.    During the last election campaign, the government promised to move forward within 100 days of its mandate. Given that commitment – as well as the structure of the consultation, which reads more like a legislative outline than a genuine attempt to solicit feedback – there are considerable doubts about this consultative process. Consultations should not be a box-ticking exercise in which the actual responses are not fully factored into policy decisions. Reading, processing, analyzing, and ultimately incorporating consultation responses within a three-month period appears entirely unrealistic. The government should provide assurances that there will be no legislation without taking the consultation responses fully into account.
