April 30, 2024
Bill C-63, if passed, will create the hotly anticipated Online Harms Act to regulate certain online platforms, create new Criminal Code of Canada offences and restore “communication of hate speech” to the Canadian Human Rights Act. Tabled on February 26, 2024, Bill C-63 follows a 2021 proposal, consultation process and accompanying technical paper that were roundly criticized as problematic in a number of ways – principally with respect to the Charter of Rights and Freedoms right to freedom of expression. The government subsequently created an expert advisory panel to help it compile feedback. Three years later, Bill C-63 looks very different from the original proposal and consultation. Here are 10 key facts about Canada’s proposed Online Harms Act and its impact on regulated social media services.
1. Focused Application
Rather than capturing a very long list of online operators, the Online Harms Act will focus on social media services. However, the federal Minister of Justice can add entities and types of entities to the scope of the Act by regulation. In the meantime, the Act will apply to social media companies that meet a threshold the regulations (which aren’t yet available) will set out, and defines “social media service” as “a website or application that is accessible in Canada, the primary purpose of which is to facilitate interprovincial or international online communication among users of the website or application by enabling them to access and share content.” The Act’s definition also expressly includes adult content services and live streaming services.
The Act expressly excludes social media services that don’t permit a user to communicate to the public and carves private messaging features out of its scope.
2. New Bureaucracy
Like the proposed Personal Information and Data Protection Tribunal Act, the Online Harms Act will create a whole new regulatory structure for managing and enforcing the Act: the Digital Safety Commission of Canada (to administer and enforce the Act), the Digital Safety Ombudsperson of Canada (to advocate for and support users) and the Digital Safety Office of Canada (to support the Commission and the Ombudsperson).
3. Operator’s Duty to Act Responsibly
The Online Harms Act will impose “a duty to act responsibly” on regulated service operators with respect to seven categories of designated “harmful content” by implementing processes and mitigation measures that the Digital Safety Commissioner must approve.
“Harmful Content”. Much of the Act will turn on the question: what is “harmful content”? The Act includes seven categories of content: intimate content communicated without consent; content that sexually victimizes a child or revictimizes a survivor; content that induces a child to harm themselves; content used to bully a child; content that foments hatred; content that incites violence; and content that incites violent extremism or terrorism.
The Act defines each category. Notably, the definition of “intimate content communicated without consent” in the Act is broader than the Criminal Code definition related to the non-consensual distribution of intimate images. In particular, the Act’s definition is expanded to include intimate “deepfakes”: images depicting a person in an explicit manner that are either modifications of existing photographs or videos or are completely synthetic, whether the product of someone’s imagination or created with artificial intelligence.
Special Treatment. While there are seven categories of “harmful content”, the Act treats two of them – intimate content communicated without consent and content that sexually victimizes a child or revictimizes a survivor – differently. This is an apparent response to the general critique of the 2021 consultation: the nonconsensual distribution of intimate images and depictions of child sexual abuse are already illegal in Canada and generally easier to identify, while the other five categories of “harmful content” raise greater freedom of expression concerns. The measures required in relation to those five categories are open-ended.
Flagging & Blocking. The Act will require regulated service operators to put in place a flagging mechanism so users can flag any harmful content – but requires specific action in relation to intimate content communicated without consent and content that sexually victimizes a child or revictimizes a survivor: the operator must make flagged content in these two categories inaccessible to persons in Canada within 24 hours of the flag and keep it inaccessible pending its review.
The Act doesn’t mandate a process for addressing the other categories of harmful content; presumably an operator will need to elaborate on that process in its digital safety plan.
Resource Person. Service operators must also make a “resource person” available to users to hear concerns, direct them to resources and give guidance on the use of those resources.
4. Mandatory Digital Safety Plan
Each regulated service operator must submit a “digital safety plan” to the Digital Safety Commission for approval.
Plan Contents. The required contents of the plan are extensive and require the operator to set out what it does to comply with the Act.
The operator must also publish most of this on its platform.
Access By Accredited Organizations. The Digital Safety Commission can accredit organizations (not individuals) to access electronic data in digital safety plans that are submitted to the Commission but aren’t required to be published. To be accredited, the organization must be conducting research, education, advocacy or awareness activities related to the Act’s purposes. The Commission can grant access to this non-public data and suspend or revoke accreditation if the accredited organization doesn’t comply with the accreditation conditions. Accredited organizations can also request access to electronic data in digital safety plans from regulated service operators and the Commission can order that the operator provide the data. However, this access is only allowed for research projects related to the Act’s purposes. This is another area where the parameters are left to the as yet unavailable regulations. There is no explicit requirement that the accredited researcher have their research approved by a Canadian research ethics board.
5. Digital Safety Commission Complaints Process
Any person in Canada can make a complaint to the Commission that content on a regulated service is intimate content communicated without consent or sexually victimizes a child or revictimizes a survivor. Upon receipt, the Commission must conduct an initial assessment of the complaint and handle it according to the process the Act sets out – a process that lacks many traditional fairness and natural justice safeguards.
Triviality Test. The threshold for an immediate takedown order is a low one: if the Commission is of the opinion the complaint is trivial, frivolous, vexatious, made in bad faith or has otherwise been dealt with, it must dismiss it. But if the Commission decides – without any substantive consideration of the merits of the complaint – the complaint isn’t trivial, it must order the operator to make the content inaccessible to all persons in Canada while its review proceeds.
User Information. The operator must ask the user that posted the content whether they consent to the provision of their contact information to the Commission. If the user consents, the operator must provide the contact information to the Commission – but it seems unlikely that many users, who are effectively accused of sharing illegal content, will consent to their information being shared with the Commission. If the user doesn’t consent, the Commission’s review will proceed without the user’s input.
Representations. Only at this point must the Commission give the complainant and the user an opportunity to make representations respecting whether the content fits into either the intimate content communicated without consent or sexually victimizes a child or revictimizes a survivor categories of harmful content.
Final Decision. Again, the threshold is low. The Commission must then decide whether there are “reasonable grounds to believe” the content fits into one of these two categories of harmful content – that’s it. If it decides there are such reasonable grounds the Commission must issue an order that the content be made permanently inaccessible to all persons in Canada. In contrast, a criminal court would be required to consider whether the content fits the definition beyond a reasonable doubt; a civil court would have to consider whether the content fits the definition on a balance of probabilities.
6. The Digital Safety Commission’s Vast Powers
The Commission’s investigative and penalizing powers are broad – and have teeth.
Powers. Among other things, the Commission can compel the production of documents and information, hold hearings, issue compliance orders and impose significant monetary penalties.
Hearing. It’s entirely in the Commission’s discretion to determine when a hearing is appropriate. If it does hold a hearing, it’s required to hold it in public – unless it isn’t. The Act sets out several circumstances in which the Commission can hold a hearing, in whole or in part, behind closed doors.
Rules of Evidence. While investigating and holding hearings, the Commission isn’t bound by the traditional legal or technical rules of evidence that apply to courts and many administrative tribunals: the Commission is required to “deal with all matters that come before it as informally and expeditiously as the circumstances and considerations of fairness and natural justice permit.” This raises further concerns about due process and procedural fairness, especially given the Commission’s punitive powers.
Compliance Orders. The Act will give the Commission broad powers to issue “compliance orders” – without the requirement of a hearing or a clear path for appeals under the Act. To grant a compliance order, all the Commission needs is “reasonable grounds to believe” a regulated service operator has contravened the Act. The operator gets no opportunity to hear the concerns, make submissions or respond before the Commission decides whether to issue an order. And what the Commission can order is largely without limits: it can make an order requiring the operator to take – or to stop – any measure to ensure compliance.
7. Enormous Penalties
The penalty for contravening a Commission compliance order is enormous: up to the greater of $25M or 8% of the regulated service operator’s global revenue – an amount that could be in the billions considering large social media companies’ 2023 revenues.
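As a back-of-the-envelope illustration of how that cap scales with operator size (the revenue figures below are hypothetical, not drawn from any actual operator):

```python
def max_penalty(global_revenue: float) -> float:
    """Maximum penalty for contravening a compliance order under the
    proposed Act: the greater of $25M or 8% of global revenue."""
    return max(25_000_000.0, 0.08 * global_revenue)

# Hypothetical operators:
print(max_penalty(50_000_000))       # small operator: the $25M floor applies
print(max_penalty(100_000_000_000))  # $100B in global revenue: exposure of roughly $8B
```

For any operator with global revenue above $312.5M, the 8% figure exceeds the $25M floor, so the exposure of the largest platforms is effectively unbounded by the dollar figure.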
8. Designated Inspectors
The Digital Safety Commission has the authority to designate “inspectors” who the Commission considers qualified for the purposes of verifying compliance or preventing non-compliance with the Act. Inspectors have authority to enter and search any premises other than a dwelling without a warrant – and without notice.
Powers. Once on the business’s premises, an inspector can examine documents, information and any other relevant thing found there – including data in any computer system – and reproduce it.
Assistance. Inspectors can require any person in charge of the premises to assist them and provide documents, information and any other thing. And inspectors can bring along anybody else they think is necessary to help them exercise their powers or perform their duties and functions.
Obligation. The Act will also include a separate, standalone power for an inspector to order anyone to provide information or documents or to access information or documents for a purpose related to verifying compliance or preventing non-compliance with the Act.
9. Age-Appropriate Design Code
Regulated operators might also be required to comply with “design features” set out in the Act. The Act will require regulated service operators to integrate into their service any design features respecting the protection of children, like age-appropriate design, set out in its regulations. This obligation gives the government the power to mandate potentially huge changes to or required elements of an online service – and could hint at a mandatory “age appropriate design code”. The basis for these requirements isn’t yet known, but before developing the U.K.’s age-appropriate design code, the U.K. Information Commissioner’s Office carried out extensive consultation, research and discussion.
10. Changes to Other Laws
In addition to enacting the Online Harms Act, Bill C-63 will amend the Criminal Code and the Canadian Human Rights Act – changes that, due to their controversial nature, have dominated discussion of the Bill both online and in the media.
Hate Crime. Bill C-63 would amend the Criminal Code to create a new hate crime, “offence motivated by hatred”, and to impose harsher penalties for hate propaganda offences.
Discrimination. Bill C-63 would also amend the Canadian Human Rights Act to effectively reinstate “communication of hate speech” as a discriminatory practice. This would give aggrieved individuals the right to file a complaint with the Canadian Human Rights Commission which, in turn, can impose monetary penalties of up to $20,000. However, these changes concern user-to-user communication and not social media platforms, broadcast undertakings, or telecommunication service providers.
Mandatory Reporting. Bill C-63 further introduces amendments to the existing An Act Respecting the Mandatory Reporting of Internet Child Pornography by Persons who Provide an Internet Service related to the mandatory reporting of child sexual abuse materials. The amendments clarify the definition of “internet service” to include access, hosting and interpersonal communication services like email. Any person providing an “internet service” to the public must send all notifications to a designated law enforcement body. The amendments would also extend the preservation period for data related to an offence, and where the materials at issue are “manifestly child pornography”, the service provider will be required to include more information in its report.
Please contact your McInnes Cooper lawyer or any member of our Privacy, Data Protection & Cyber Security Team @ McInnes Cooper to discuss how to prepare to comply with the Online Harms Act.
McInnes Cooper has prepared this document for information only; it is not intended to be legal advice. You should consult McInnes Cooper about your unique circumstances before acting on this information. McInnes Cooper excludes all liability for anything contained in this document and any use you make of it.
© McInnes Cooper, 2024. All rights reserved. McInnes Cooper owns the copyright in this document. You may reproduce and distribute this document in its entirety as long as you do not alter the form or the content and you give McInnes Cooper credit for it. You must obtain McInnes Cooper’s consent for any other form of reproduction or distribution. Email us at [email protected] to request our consent.