Picking Up Where Bill C-10 Left Off: The Canadian Government’s Non-Consultation on Online Harms Legislation
The Canadian government released its plans yesterday for online harms legislation with a process billed as a consultation, but which is better characterized as an advisory notice, since it offers few questions or options and shows little apparent interest in hearing what Canadians think of the plans. Instead, the plans led by Canadian Heritage Minister Steven Guilbeault pick up where Bill C-10 left off, treating freedom of expression as a danger to be constrained through regulations and the creation of a bureaucratic super-structure that includes a new Digital Safety Commission, a digital tribunal to rule on content removal, and a social media regulation advisory board. When combined with plans for a new data commissioner, a privacy tribunal, and the expanded CRTC under Bill C-10, the sheer amount of new Internet governance is dizzying.
While there is clearly a need to address online harms and to ensure that Internet companies are transparent in their policies, consistent in applying those policies, and compliant with their legal obligations, this proposed legislation goes far beyond those principles. The government has indicated that these rules apply only to Internet services (dubbed Online Communications Services or OCSs), citing Facebook, YouTube, TikTok, Instagram, and Twitter as examples. It notes that there will be an exception for private communications and telecommunications such as wireless companies, Skype, and WhatsApp (along with products and services such as TripAdvisor that are not OCSs). Yet during a briefing with stakeholders, officials were asked why the law should not be extended to private communications on platforms as well, since these harms may also occur in private messaging. Given that the government previously provided assurances that user-generated content would be excluded from Bill C-10, only to backtrack and make it subject to CRTC regulation, the risk that it could once again remove safeguards for basic speech is very real.
The perspective on OCSs is clear from the very outset. After a single perfunctory statement on the benefits of OCSs that says little about freedom of expression – the document does not include a single mention of the Charter of Rights and Freedoms or net neutrality – the government proceeds to outline a series of harms, including spreading hateful content, propaganda, violence, sexual exploitation of children, and non-consensual distribution of intimate images. The proposed legislation would seek to address these forms of harmful content through a myriad of takedown requirements, content filtering, complaints mechanisms, and even website blocking.
How does the government intend to address these harms?
The general obligations would include requiring OCSs to implement measures to identify harmful content and to respond to any content flagged by any user within 24 hours. The OCSs would be required to either identify the content as harmful and remove it or respond by concluding that it is not harmful. The OCSs can seek assistance from the new Digital Safety Commissioner on content moderation issues. The proposed legislation would then incorporate a wide range of reporting requirements, some of which would be subject to confidentiality restrictions, so the companies would be precluded from notifying affected individuals.
The government envisions pro-active monitoring and reporting requirements that could have significant implications. For example, it calls for pro-active content monitoring of the five harms, granting the Digital Safety Commissioner the power to assess whether the AI tools used are sufficient. Moreover, the OCSs would face mandatory requirements to report users to law enforcement, leading to the prospect of an AI identifying what it thinks is content caught by the law and generating a report to the RCMP. This represents a huge increase in private enforcement and the possibility of Canadians garnering police records over posts that a machine thought were captured by the law.
In order to enforce these rules, the public could file complaints with the Digital Safety Commissioner. The new commissioner would be empowered to hold hearings on any issue, including non-compliance or anything that the Commissioner believes is in the public interest. The Digital Safety Commissioner would have broad powers to order the OCSs “to do any act or thing, or refrain from doing anything necessary to ensure compliance with any obligations imposed on the OCSP by or under the Act within the time specified in the order.” Moreover, the Commissioner would also be able to conduct inspections of companies at any time:
“The Act should provide that the Digital Safety Commissioner may conduct inspections of OCSPs at any time, on either a routine or ad hoc basis, further to complaints, evidence of non-compliance, or at the Digital Safety Commissioner’s own discretion, for the OCSP’s compliance with the Act, regulations, decisions and orders related to a regulated OCS.”
In fact, the inspection power extends to anyone, not just OCSs, if there are reasonable grounds that there may be information related to software, algorithms, or anything else relevant to an investigation.
The proposed legislation includes administrative and monetary penalties (AMPs) for non-compliance, including failure to block or remove content. These penalties can run as high as three percent of global revenue or $10 million. If there is a failure to abide by a compliance agreement, the AMPs can run to $25 million or five percent of global revenue. The AMPs would be referred to the new privacy tribunal for review. Given that liability for non-compliance could run into the millions, companies will err on the side of taking down content even if there are doubts that it qualifies as harmful.
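The scale of that exposure is easy to illustrate. A quick sketch, assuming the cap works out to the greater of the flat amount and the revenue percentage (an assumption for illustration; the revenue figure below is hypothetical):

```python
def amp_exposure(global_revenue: float, breached_agreement: bool = False) -> float:
    """Illustrative ceiling on an administrative monetary penalty (AMP).

    Assumes the cap is the greater of the flat amount and the percentage
    of global revenue cited in the proposal -- an interpretation, not a
    confirmed reading of the draft rules.
    """
    if breached_agreement:
        # Failure to abide by a compliance agreement: $25M or 5% of global revenue
        return max(25_000_000, 0.05 * global_revenue)
    # Ordinary non-compliance: $10M or 3% of global revenue
    return max(10_000_000, 0.03 * global_revenue)


# For a hypothetical platform with $80 billion in global revenue:
print(f"${amp_exposure(80e9):,.0f}")        # $2,400,000,000
print(f"${amp_exposure(80e9, True):,.0f}")  # $4,000,000,000
```

For a large platform, the revenue-based figure dwarfs the flat cap by orders of magnitude, which is precisely why the incentive runs toward over-removal.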
If the OCS still doesn’t comply with the order to remove certain content, the proposed legislation introduces the possibility of website blocking with orders that all Canadian ISPs block access to the online communications service. The implications of these provisions are enormous, raising the likelihood of creating a country-wide blocking infrastructure within all ISPs with the costs passed on to consumers in the form of higher Internet and wireless bills. Moreover, the proposal is the answer to those who may argue that Canada does not have the power to compel this level of content blocking on foreign services as the government says it will simply order those services blocked from the country if they fail to abide by Canadian content takedown requirements.
Where a company declines to take down content, the public can also file complaints with the new Digital Recourse Council of Canada. This regulatory body would have the power to rule that content be taken down. Hearings can be conducted in secret under certain circumstances. Layered on top of these two bodies is a Digital Safety Commission, which provides support to the Commissioner and the complaints tribunal.
Who pays for all this?
The Internet companies, of course. The proposed legislation will create new regulatory charges for OCSs doing business in Canada to cover the costs of the regulatory structure, as the companies will pay for the Digital Safety Commissioner, the Digital Recourse Council, and the Digital Safety Commission. As part of the payment requirements, the Digital Safety Commissioner can demand financial disclosures from OCSs to determine their Canadian revenues and ability to pay.
Far from constituting a made-in-Canada approach, the government has patched together some of the worst from around the world: 24-hour takedown requirements that will afford little in the way of due process and will lead to over-broad content removals on even questionable claims, website blocking of Internet platforms that won’t abide by its content takedown requirements, a regulatory super-structure with massive penalties and inspection powers, hearings that may take place in secret in some instances, and regulatory charges that may result in less choice for consumers as services block the Canadian market. Meanwhile, core principles such as the Charter of Rights and Freedoms or net neutrality do not receive a single mention.
The government says it is taking comments until September 25th, but given the framing of the documents, it is clear that this is little more than a notification of the regulatory plans, not a genuine effort to craft solutions based on public feedback. For a government that was elected with a strong grounding in consultation and freedom of expression, the reversal in approach could hardly be more obvious.