Malaysia

Age verification and child online safety are part of Malaysia’s regulatory agenda, driven by concerns about children’s exposure to harmful content, cyberbullying, sexual exploitation and scams. The Malaysian Parliament passed the Online Safety Bill in December 2024, and the resulting Online Safety Act 2025 came into force on 1 January 2026.

The Act empowers the Malaysian Communications and Multimedia Commission (MCMC) to regulate online services, require licences for major digital platforms and enforce age-related safeguards. The framework emphasises identity verification and age checks, which may involve eKYC methods such as national ID (MyKad) or passport checks.

Officials have indicated that platforms must implement identity and age verification mechanisms as part of licensing and safety compliance.
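
Because a MyKad number encodes the holder’s date of birth in its first six digits (YYMMDD), a national-ID check can double as an age signal. The sketch below shows only that date arithmetic; the function name and the century heuristic are illustrative assumptions, and a real eKYC flow would also authenticate the document and the holder (chip read, liveness detection, issuer verification).

    from datetime import date

    def age_from_mykad(ic_number: str, today: date | None = None) -> int:
        """Derive age from the first six digits (YYMMDD) of a Malaysian
        MyKad number. Illustrative only: production eKYC must also
        authenticate the document and the holder."""
        today = today or date.today()
        digits = ic_number.replace("-", "")
        if len(digits) != 12 or not digits.isdigit():
            raise ValueError("expected a 12-digit MyKad number")
        yy, mm, dd = int(digits[:2]), int(digits[2:4]), int(digits[4:6])
        # The two-digit year is ambiguous; assume 20xx unless that puts
        # the birth year in the future (a common, imperfect heuristic).
        year = 2000 + yy if 2000 + yy <= today.year else 1900 + yy
        dob = date(year, mm, dd)
        return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

    # Example: a holder born on 2 January 2008 is 17 as of mid-2025.
    print(age_from_mykad("080102-10-1234", today=date(2025, 6, 1)))  # -> 17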

Social Media

Minimum Age Requirements

The Communications Minister has announced that Malaysia will raise the minimum age for social media use to 16[1]. Subsidiary regulations under the Online Safety Act are expected to:

  • Prevent children under 16 from accessing social media platforms

  • Require age-appropriate content controls for users under 18

  • Require platforms to deploy age verification systems to block under-age users (a minimal gating sketch follows this list)
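
The three expected obligations map naturally onto tiered access logic. The sketch below is a minimal illustration; the names and cut-offs are assumptions, and the binding thresholds would be fixed by the subsidiary regulations themselves.

    from enum import Enum

    class AccessTier(Enum):
        BLOCKED = "blocked"        # under 16: no access to the platform
        RESTRICTED = "restricted"  # 16-17: age-appropriate content only
        FULL = "full"              # 18 and over: unrestricted

    MIN_ACCESS_AGE = 16  # expected minimum age for social media use
    ADULT_AGE = 18       # age-appropriate controls apply below this

    def access_tier(verified_age: int) -> AccessTier:
        """Map a verified age onto the tiered obligations above."""
        if verified_age < MIN_ACCESS_AGE:
            return AccessTier.BLOCKED
        if verified_age < ADULT_AGE:
            return AccessTier.RESTRICTED
        return AccessTier.FULL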

The MCMC is expected to engage with major technology companies, including Meta, Google and TikTok, to develop implementation mechanisms aligned with data protection safeguards.

Platform Enforcement and Compliance

The Malaysian government has also taken direct action involving specific platforms. For example:

  • Authorities have pressured social media platforms to strengthen age verification systems[2]

  • Platform executives have been summoned to cooperate with regulators and law enforcement

  • Platforms with large user bases are expected to implement stronger age verification as part of licensing obligations

Adult and Harmful Content Controls

Age assurance requirements are linked to preventing children from accessing harmful or inappropriate online material. Under the Online Safety Act framework, platforms are expected to:

  • Implement safeguards to restrict minors’ access to harmful or age-inappropriate content

  • Use identity and age verification systems to support access controls

  • Ensure content provided to users under 18 is age-appropriate

These requirements are connected to broader child protection and online safety objectives.

Chatbots, AI and Interactive Applications

Malaysia is considering extending age verification and identity assurance measures to additional digital services. Policy discussions indicate that these may include:

  • Interactive online applications

  • AI-driven or conversational services operating as part of regulated online platforms

  • Services that provide user-generated or algorithmically delivered content

Online Gaming and Other Digital Services

The government is also examining age verification requirements for online gaming and similar services. Discussions include:

  • Potential licensing regimes for gaming and interactive platforms

  • Application of identity verification and age assurance tools

  • Expansion of compliance obligations beyond social media services

Authorities have also emphasised parental responsibility and digital safety education as complementary measures.

Enforcement Powers and Regulatory Oversight

Under the Communications and Multimedia Act 1998 and the Online Safety Act 2025, the MCMC has enforcement powers, including the ability to:

  • Issue binding compliance directions

  • Impose administrative fines

  • Order suspension or blocking of non-compliant services

  • Revoke licences in cases of persistent breaches

These powers make age assurance and child safety obligations legally enforceable rather than voluntary.

Technical Implementation and Standards

The MCMC has established a regulatory sandbox to test age assurance technologies with selected service providers[3]. The sandbox is intended to evaluate:

  • Reliability and accuracy of age verification and age estimation technologies (see the error-rate sketch after this list)

  • Resistance to fraud and circumvention

  • Privacy and data minimisation safeguards

  • User experience and accessibility
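
Trials of this kind typically report error rates at the regulatory threshold. The function below is an illustrative assumption about what such a measurement could look like, not the MCMC’s actual test protocol: it computes how often an under-16 user is wrongly passed and how often a 16-plus user is wrongly blocked.

    def sandbox_error_rates(samples, threshold=16):
        """Given (true_age, estimated_age) pairs, return the false
        acceptance rate (under-threshold users passed) and the false
        rejection rate (over-threshold users blocked)."""
        fa = fr = under = over = 0
        for true_age, est_age in samples:
            if true_age < threshold:
                under += 1
                fa += est_age >= threshold
            else:
                over += 1
                fr += est_age < threshold
        return (fa / under if under else 0.0,
                fr / over if over else 0.0)

    # Example: one error in each direction across four trial users.
    print(sandbox_error_rates([(14, 17), (15, 14), (19, 15), (30, 31)]))
    # -> (0.5, 0.5)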

The results are expected to inform future technical standards or certification schemes.

Child Data Protection and Algorithmic Safeguards

Additional child-focused data protection measures are being discussed, including:

  • Restrictions on targeted advertising to minors

  • Limitations on behavioural profiling of child users

  • Expectations that recommendation and discovery algorithms do not promote harmful content to children (a minimal filtering sketch follows this list)
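
Taken together, the measures above amount to filtering a child user’s candidate feed before delivery. The sketch below is a minimal illustration under assumed field names; a real system would apply such rules inside the ranking pipeline rather than as a post-hoc filter.

    from dataclasses import dataclass

    @dataclass
    class Item:
        item_id: str
        min_age: int         # content age rating, e.g. 0, 13, 16, 18
        is_targeted_ad: bool

    def recommendations_for(user_age: int, candidates: list[Item]) -> list[Item]:
        """Drop candidates a child user should not receive."""
        kept = []
        for item in candidates:
            if item.min_age > user_age:
                continue  # age-inappropriate content
            if user_age < 18 and item.is_targeted_ad:
                continue  # no targeted advertising to minors
            kept.append(item)
        return kept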

These measures are intended to complement access controls and reduce risks after user access is granted.


[1] https://www.reuters.com/world/asia-pacific/malaysia-says-it-plans-ban-social-media-under-16s-2026-2025-11-24/

[2] https://www.reuters.com/business/media-telecom/malaysia-pushes-tiktok-age-verification-protect-minors-2025-09-04/

[3] https://www.komunikasi.gov.my/en/public/news/27919-fahmi-government-to-roll-out-social-media-regulatory-sandbox-from-jan-1