Regulating What Canadians See Online: Why Bill C-10 Would Establish CRTC-Approved TikTok, YouTube and Instagram Feeds
The uproar over Bill C-10 has rightly focused on the government’s decision to remove safeguards for user generated content from the bill. Despite insistence from Canadian Heritage Minister Steven Guilbeault that users will not be regulated and from Prime Minister Justin Trudeau that users will not be required to make Cancon contributions, the reality is that the removal of Section 4.1 from the bill means that all user generated content is treated as a “program” under the Act and is therefore subject to regulation by the CRTC.
That regulation is extensive and can include “discoverability” requirements that would allow the regulator to mandate that platforms prioritize some users’ content over others. Section 9.1(1)(b) of the bill states:
The Commission may, in furtherance of its objects, make orders imposing conditions on the carrying on of broadcasting undertakings that the Commission considers appropriate for the implementation of the broadcasting policy set out in subsection 3(1), including conditions respecting
(b) the presentation of programs for selection by the public, including the discoverability of Canadian programs;
Since the government is now treating user generated content as a program under the Act, this effectively reads that the CRTC can establish conditions respecting the presentation of user generated content for selection by the public, including the discoverability of user generated content.
This aspect of the CRTC’s powers and the government’s plans has not received much attention, but it raises the prospect of CRTC-approved feeds for services such as TikTok, YouTube, and Instagram. The government has said it plans new amendments that will address concerns about regulating user generated content, but it has also maintained that it wants to retain the discoverability requirements. Indeed, the Prime Minister specifically referenced those requirements in the House of Commons yesterday.
The government is trying to have it both ways, arguing that it doesn’t want to regulate user generated content and then proceeding to regulate it by establishing conditions on what content users may access in their social media services. This has direct implications for free expression as it will fall to a regulator to determine which speech is prioritized online. As David Fraser recently noted, “any regulation of how a platform presents expressive content to an audience implicates the content itself.”
I’ve written previously about the claims related to discoverability in Bill C-10, including the lack of evidence that there is a discoverability problem (the Yale report found very little) and the fact that finding Canadian content on a service such as Netflix requires little more than typing “Canada” into a search box. Yet beyond the ease with which Canadian content can be found on audio and video-on-demand services, no one – literally no other country – thinks that mandating domestic content requirements on user generated content platforms makes any sense whatsoever. As far as I can tell, no one does what Guilbeault and the government want to mandate. Some have pointed to Pakistan, which has extensive regulations, but those appear to primarily target content blocking rather than government-mandated content prioritization.
Guilbeault has frequently (and misleadingly) claimed that Bill C-10 is similar in approach to European Union regulation of audio-visual services. That claim is false, as I discuss in this post. But it should be noted that even the European Union approach – which involves considerable regulation – does not contemplate creating domestic content requirements for user generated content. Indeed, the directive explicitly treats audiovisual media services (such as Netflix) and video sharing platform services (such as YouTube) differently. Audiovisual media services that curate content face content requirements similar to those for conventional broadcasters. Video sharing platform services face rules with respect to removing certain illegal or harmful content, but there are no quotas and no positive obligations to prioritize some content over others.
Not only is such an approach unworkable (how would regulators even identify what counts as domestic user generated content?), but it would represent an exceptionally heavy-handed regulatory approach in which a government-appointed regulator decides what individual user generated content is prioritized in order to further “discoverability”, a term that isn’t even defined in Bill C-10. There is a need for greater transparency of the algorithms used by social media companies, but to turn the content choices in the social media feeds of millions of Canadians over to the CRTC is madness and an abdication of the government’s professed support for freedom of expression.