Canada does not yet have a single overarching national online safety law. Instead, age assurance expectations arise from a combination of sector-specific regulation, privacy law, and emerging legislation, most notably Bill S-209 (Protecting Young Persons from Exposure to Pornography Act) and Bill C-63 (Online Harms Act). If enacted, these bills would establish a statutory framework for online safety and age assurance.
National legal framework and regulators
Canada’s federal framework currently relies on existing statutory regimes. Key pillars include:
- Criminal law (through the Criminal Code) governing child sexual exploitation, obscene material, and luring offences.
- Telecommunications and broadcasting regulation under the Broadcasting Act and Telecommunications Act, enforced by the Canadian Radio-television and Telecommunications Commission (CRTC), particularly for traditional broadcast and some online audio/visual services.
- Privacy law under the Personal Information Protection and Electronic Documents Act (PIPEDA), which applies federally and in provinces without substantially similar legislation, together with corresponding provincial privacy statutes, all of which shape how platforms can collect and use personal and age-related data.
- Competition and consumer protection law, which can intersect with age-related disclosures and unfair practices.
Age and content restrictions in current law
Canada’s Criminal Code contains strict prohibitions on making child pornography available and on material “harmful to minors”, and it includes offences for luring a child (communicating with a young person with intent to facilitate sexual activity) and for child sexual exploitation. These provisions do not specify age-verification mechanisms, but they create a legal backdrop in which platforms can be held liable for profiting from, or facilitating, minors’ access to illegal content.
Privacy regime and child data
Under PIPEDA and provincial privacy law, personal data, including data that could identify age, must be collected, used and disclosed with appropriate legal bases, notice, and safeguards. Privacy regulators have signalled that age assurance systems must be feature-specific, proportionate and privacy-preserving; unnecessary or overly intrusive data collection for purely age-assurance purposes can be problematic.
Criminal and civil enforcement may overlap where platforms host or facilitate access to content that is illegal to distribute to minors.
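To illustrate the data-minimisation expectation described above, the sketch below shows one way a platform might retain only the conclusion of an age check rather than the underlying identity data. This is a minimal sketch under stated assumptions: the type names, method labels, and 18-year threshold are illustrative and are not drawn from PIPEDA or any regulator’s guidance.

```typescript
// Hypothetical sketch: data minimisation in age assurance.
// Persist only the derived conclusion, never the raw identity inputs.

type AgeAssuranceMethod = "credential_check" | "estimation" | "parental_attestation";

interface AgeAssuranceRecord {
  userId: string;              // platform's own pseudonymous identifier
  over18: boolean;             // derived conclusion only
  method: AgeAssuranceMethod;  // how the conclusion was reached
  assuredAt: string;           // ISO timestamp, useful for re-verification policies
}

// Derive the conclusion and keep only that; the date of birth and any
// identity document supplied to the verification step are discarded here.
function recordAgeAssurance(
  userId: string,
  dateOfBirth: Date,
  method: AgeAssuranceMethod,
  now: Date = new Date()
): AgeAssuranceRecord {
  const eighteenthBirthday = new Date(dateOfBirth);
  eighteenthBirthday.setFullYear(eighteenthBirthday.getFullYear() + 18);
  const over18 = eighteenthBirthday <= now;
  return { userId, over18, method, assuredAt: now.toISOString() };
}
```

Retaining only a boolean conclusion, the method used, and a timestamp is one way to keep an age-assurance system proportionate; the exact retention design would depend on the service and applicable regulator guidance.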
Emerging law: Bill S-209
Bill S-209 is the most significant pending federal legislation directly addressing online safety and age assurance. It was introduced in the Senate in 2023 with the stated purposes of:
- Requiring large online platforms to take steps to prevent children’s exposure to harmful material;
- Creating enforceable safety duties;
- Establishing an independent online safety commissioner and administrative enforcement regime; and
- Providing a statutory basis for age-assurance obligations.
Key proposals in S-209 relevant to age assurance include:
Safety duties: Platforms would be required to take reasonable steps to mitigate exposure of users, especially children, to harmful content, including through product design (for example, settings defaults and age gating, as sketched below). The duty is broadly framed and would be enforced by an online safety commissioner.
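To make “settings defaults and age gating” in product design more concrete, here is a minimal hypothetical sketch of safer-by-default settings that vary by age band. The bands, setting names, and default values are assumptions for illustration, not requirements drawn from Bill S-209.

```typescript
// Hypothetical sketch: safer defaults for accounts believed to belong to minors
// or whose age is unknown. All names and values are illustrative.

type AgeBand = "under13" | "13to17" | "adult" | "unknown";

interface SafetySettings {
  directMessagesFromStrangers: boolean;
  personalisedRecommendations: boolean;
  matureContentVisible: boolean;
  profileDiscoverable: boolean;
}

function defaultSettingsFor(band: AgeBand): SafetySettings {
  switch (band) {
    case "adult":
      return {
        directMessagesFromStrangers: true,
        personalisedRecommendations: true,
        matureContentVisible: true,
        profileDiscoverable: true,
      };
    case "13to17":
      return {
        directMessagesFromStrangers: false,
        personalisedRecommendations: true,
        matureContentVisible: false,
        profileDiscoverable: false,
      };
    // Treat unknown age the same as the youngest band: safest defaults apply
    // until age assurance provides a stronger signal.
    case "under13":
    case "unknown":
    default:
      return {
        directMessagesFromStrangers: false,
        personalisedRecommendations: false,
        matureContentVisible: false,
        profileDiscoverable: false,
      };
  }
}
```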
Age assurance: Bill S-209 contemplates that platforms would be required to take steps to limit access by minors to content inappropriate for them, which would logically include auditable age verification where content is age-restricted. The bill does not, at this stage, prescribe one specific technology, but it would authorise regulation of age-assurance obligations by the safety regulator.
Reporting and transparency: The draft framework would mandate transparency reporting and risk assessments, including potentially reporting on how the platform determines whether users are children.
Enforcement: S-209 would create administrative orders, fines, and other penalties for non-compliance, with escalating sanctions for failures to protect children.
S-209 is not yet enacted, and significant regulatory development would be needed to transform the broad duties into detailed age-assurance requirements. However, it represents the strongest signal to date that Canada intends to move towards statutory online safety regulation with age assurance at its core.
Emerging law: Bill C-63
Bill C-63 is a proposed federal law, introduced in February 2024 and being revived in the 2026 session, that would establish Canada’s first comprehensive statutory framework for addressing harmful online content. It combines platform regulation, criminal law amendments, and human-rights enforcement. Its stated objectives include:
- Requiring regulated online services to act responsibly to reduce users’ exposure to defined categories of harmful content, including child sexual exploitation material, non-consensual intimate images, content that incites violence or terrorism, and content that foments hatred;
- Imposing duties on platforms to make certain harmful content inaccessible once they become aware of it or receive a valid user complaint, with strict timelines for removal in specific cases;
- Creating a new Digital Safety Commission of Canada, supported by a Digital Safety Ombudsperson, with powers to receive complaints, conduct investigations, issue compliance orders, and impose significant administrative monetary penalties;
- Amending the Criminal Code to strengthen hate-related offences, including higher maximum penalties and new preventive peace bond measures for individuals at risk of committing hate-motivated crimes;
- Amending the Canadian Human Rights Act to allow individuals to bring complaints regarding online hate speech likely to incite vilification of protected groups; and
- Expanding mandatory reporting obligations for internet service providers to include transmission data when reporting online child sexual exploitation material to law enforcement.
While Bill C-63 does not mandate proactive content monitoring or directly regulate private messaging services, it represents a major shift toward formal, enforceable online safety obligations in Canada, with potentially significant implications for platform governance, content moderation practices, and compliance risk.
Sector-specific regulation and age assurance
In the absence of a unified online safety law, some Canadian provinces and sectors have their own frameworks with age implications:
• Broadcasting and video services: The CRTC extends broadcast-like regulatory obligations to online video services operating in Canada. Age ratings and access controls are used to manage under-age access to adult or harmful audiovisual content.
• Telecommunications interception and access law: While not age-specific, some enforcement action in other contexts hinges on knowing the age of users who access certain services.
• Provincial child protection law: Canadian provinces have their own child protection statutes and reporting duties that bind service providers in some contexts.
What this means for service providers
At present, online age assurance in Canada is not driven by a single, prescriptive national age-verification law. Instead, it is shaped by:
• the Criminal Code’s prohibitions on distributing illegal or harmful material to minors;
• privacy law constraints and expectations around age-related data; and
• the expectations and emerging duties articulated in Bill S-209, which would create a statutory online safety framework if enacted.
In practice, platforms accessible in Canada should:
• implement privacy-preserving age assurance where content is harmful or illegal for minors;
• integrate age controls into product design and default settings; and
• be ready to produce auditable evidence of steps taken to block under-age access, in line with emerging regulatory trends (a minimal sketch of such an audit record follows this list).
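The sketch below illustrates, under stated assumptions, what an auditable record of an age-gate decision might look like. The field names, signal values, and policy identifier are hypothetical; no Canadian statute currently prescribes such a format.

```typescript
// Hypothetical sketch: an auditable age-gate decision record.
// All field names and values are illustrative assumptions.

interface AgeGateDecision {
  contentId: string;                 // the restricted item requested
  userIdHash: string;                // pseudonymised user reference
  assuranceSignal: "verified_adult" | "self_declared" | "none";
  decision: "allowed" | "blocked";
  policy: string;                    // internal policy or rule applied
  decidedAt: string;                 // ISO timestamp
}

function decideAccess(
  contentId: string,
  userIdHash: string,
  assuranceSignal: AgeGateDecision["assuranceSignal"]
): AgeGateDecision {
  // Only a verified adult signal unlocks age-restricted content;
  // self-declaration alone is treated as insufficient in this sketch.
  const allowed = assuranceSignal === "verified_adult";
  const record: AgeGateDecision = {
    contentId,
    userIdHash,
    assuranceSignal,
    decision: allowed ? "allowed" : "blocked",
    policy: "age-restricted-content-v1",   // hypothetical internal policy id
    decidedAt: new Date().toISOString(),
  };
  // In a real system this record would be appended to a tamper-evident log
  // so the platform could later evidence the steps it took.
  return record;
}
```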
Looking ahead
Canada’s age assurance landscape is in transition. Bill S-209 would, if enacted, establish a statutory online safety regime with enforceable duties that likely include age verification and age gating. Even before enactment, privacy and criminal law already discourage under-age access to harmful content and require platforms to be able to demonstrate that minors have been excluded from such content. As the legislative process continues, platforms should monitor developments closely and prepare to integrate evidence-based, proportionate age-assurance controls into their design and compliance programmes.