
Australia

Australia now has one of the most operationalised age assurance regimes in the democratic world. Age assurance is driven through the Online Safety Act 2021 and its delegated instruments, enforced by the eSafety Commissioner. Two strands matter most in practice: mandatory industry codes governing access to online pornography and other high-impact adult material, and the Social Media Minimum Age (SMMA) obligation, which requires age-restricted social media platforms to take reasonable steps to prevent under 16s from holding accounts.

Australia’s system operates as a layered stack:

  • Basic Online Safety Expectations set baseline government expectations for safety governance across a wide range of services
  • Industry codes impose more specific obligations, including age-gating and access controls for certain material, across multiple parts of the online ecosystem
  • The Social Media Minimum Age obligation adds a separate, account-level duty to prevent under 16s from maintaining accounts on age-restricted social media platforms

Australia has moved beyond policy statements into enforceable age assurance duties, with an implementation timetable whose final tranche of codes commences from March 2026.

National legal framework and regulators

Australia’s core online safety framework is the Online Safety Act 2021 (OSA). The primary regulator is the eSafety Commissioner, who administers the OSA’s online content scheme, issues notices and directions, and can pursue civil penalties where obligations are not met.

Alongside the OSA, Australia’s privacy regulator, the Office of the Australian Information Commissioner (OAIC), has published guidance on the SMMA and related privacy expectations, which is relevant where age checks involve identity evidence or biometric processing.

Basic Online Safety Expectations

A distinctive feature of Australia’s model is the Basic Online Safety Expectations (BOSE), set by ministerial determination under the OSA. BOSE is not itself a prescriptive technical standard, but it establishes government expectations that regulated services take reasonable steps to keep Australians safe, with eSafety able to request information about how providers are meeting those expectations. eSafety updated its BOSE Regulatory Guidance in December 2025. (eSafety Commissioner)

Industry codes for age-restricted material
Australia’s most direct age assurance obligations sit in the Online Safety industry codes for Class 1C and Class 2 material, which aim to limit access to online pornography and other high-impact adult content.

The codes are published and tracked through eSafety’s Register of Online Safety Codes and Standards. In summary:

  • The Age-Restricted Material Codes for hosting services, internet carriage services and internet search engine services were registered on 27 June 2025 and measures commence from 27 December 2025, with some later commencements depending on the measure and service category
  • The remaining six codes, covering social media services (core features), social media services (messaging features), relevant electronic services, designated internet services, app distribution platforms and equipment providers, were registered on 9 September 2025 and begin coming into effect from 9 March 2026 onwards

These codes matter for age assurance because they are the main pathway for requiring services to implement age-gating and access controls for adult material, including where content is distributed through search, hosting, apps and platform features rather than on standalone adult sites.

Social Media Minimum Age obligation (SMMA)
Australia has introduced a statutory minimum age framework for social media accounts, commonly referred to as the Social Media Minimum Age (SMMA). Government guidance states that, from 10 December 2025, “age-restricted social media platforms” must take reasonable steps to prevent Australians under 16 from creating or keeping accounts.

The enabling legislation is the Online Safety Amendment (Social Media Minimum Age) Act 2024, which amends the Online Safety Act 2021 to establish the minimum age and the “reasonable steps” obligation.

SMMA does not mandate a single method or require universal ID checks for every user. Instead, it creates an outcome test: platforms must be able to show they are taking reasonable steps, proportionate to risk, to prevent under 16s from holding accounts. This pushes platforms towards auditable age assurance that can withstand regulator scrutiny while meeting privacy and proportionality expectations.

Where Australia is with implementation

The implementation picture is mixed by sector:

  • Codes covering hosting, carriage and search commence from 27 December 2025
  • Platform-facing codes for social media, app distribution, relevant electronic services, designated internet services and equipment providers are scheduled to begin from 9 March 2026 onwards
  • BOSE remains in effect as the broader expectations layer and eSafety’s regulatory guidance was refreshed in December 2025
  • SMMA came into force on 10 December 2025, initially with ten major social media platforms in scope; others may be added in future