In Malaysia, age verification and child online safety have become an increasingly prominent part of the government’s regulatory agenda as authorities respond to concerns about children’s exposure to harmful content, cyberbullying, sexual exploitation and scams on social media and other digital platforms. In late 2025 the Malaysian Parliament passed the Online Safety Act 2025, which came into force on 1 January 2026 and empowers the Malaysian Communications and Multimedia Commission (MCMC) to regulate online services, require licences for major digital platforms and enforce new age-related safeguards. Under this framework, the government has made clear that identity verification and age checks will be required on social media platforms to ensure that underage users do not create or maintain accounts. Electronic Know Your Customer (eKYC) standards, using official documentation such as MyKad, passports or the national MyDigital ID system, are central to this approach, and officials have indicated that platforms must be prepared to implement these identity checks and age verification mechanisms as part of their licensing and safety obligations.
The Communications Minister has announced that Malaysia will raise the minimum age for social media use to 16[1], reflecting growing concern over the impact of social platforms on young people’s wellbeing. This policy shift is part of subsidiary regulations being drafted under the Online Safety Act, intended to prevent children under 16 from accessing social media and to ensure that content delivered to those under 18 is age-appropriate. Platforms will be required to adopt age verification systems that block access for younger users, and the MCMC is expected to engage with major tech companies such as Meta, Google and TikTok to design practical mechanisms aligned with data protection safeguards.
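The two thresholds described above — no social media accounts for children under 16, and age-appropriate content for users under 18 — amount to a simple classification over a verified date of birth. The sketch below illustrates that logic only; the function names, tier labels and the assumption that a birthdate has already been verified via eKYC are illustrative, not part of any MCMC specification:

```python
from datetime import date

# Thresholds described in the draft policy (illustrative constants):
# under-16 users are blocked from social media; 16-17-year-olds
# receive age-appropriate content only.
MIN_ACCOUNT_AGE = 16
ADULT_AGE = 18

def age_from_birthdate(birthdate: date, today: date) -> int:
    """Compute age in whole years from a verified date of birth."""
    years = today.year - birthdate.year
    # Subtract one year if the birthday has not yet occurred this year.
    if (today.month, today.day) < (birthdate.month, birthdate.day):
        years -= 1
    return years

def access_tier(birthdate: date, today: date) -> str:
    """Map a verified birthdate (e.g. from an eKYC check) to an access tier."""
    age = age_from_birthdate(birthdate, today)
    if age < MIN_ACCOUNT_AGE:
        return "blocked"          # no account permitted
    if age < ADULT_AGE:
        return "age_appropriate"  # restricted, age-appropriate content
    return "full"

print(access_tier(date(2012, 6, 1), date(2026, 1, 15)))  # 13-year-old -> blocked
print(access_tier(date(2009, 1, 1), date(2026, 1, 15)))  # 17-year-old -> age_appropriate
```

In practice the hard part is not this classification but the upstream verification step — establishing a trustworthy birthdate without over-collecting identity data — which is what the eKYC and MyDigital ID discussions address.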
Malaysia has also pushed specific companies on age verification. The government has pressured TikTok to strengthen its age checks to better protect minors from harmful content[2], summoning executives and urging cooperation between regulators and police to improve safety measures. Authorities have repeatedly signalled that digital platforms with significant user bases must do more to verify users’ ages to comply with licensing requirements and to reduce children’s exposure to inappropriate material.
Beyond social media, the government is considering extending age verification requirements to online gaming and other interactive applications, exploring licensing regimes and identity checks for these services as well. Officials have highlighted the role of digital IDs and eKYC systems in confirming age and identity for a range of online services, while also emphasising the importance of parental responsibility and broader digital safety education.
Under the Communications and Multimedia Act and the Online Safety Act 2025, the Malaysian Communications and Multimedia Commission may issue binding compliance directions, impose administrative fines, order the suspension or blocking of non-compliant services, and revoke platform licences where persistent breaches occur. This gives age assurance and child safety obligations tangible regulatory consequences rather than relying on voluntary adoption.
Implementation of age assurance is progressing through technical evaluation as well as policy design. MCMC has established a regulatory sandbox to test age verification and age estimation technologies with selected service providers[3]. These pilots are intended to assess reliability, fraud resistance, privacy safeguards, data minimisation practices and user experience before formal technical standards or certification schemes are issued.
Additional child-specific data protection measures are also emerging in policy discussions. These include limitations on targeted advertising to minors, restrictions on behavioural profiling of child users and expectations that recommendation or discovery algorithms should not amplify harmful content to children. These measures complement age assurance by addressing downstream risks even where access controls are in place.
Overall, Malaysia’s approach reflects a trend toward more rigorous age checks and controls in the digital environment, with age verification increasingly framed as a regulatory requirement for platforms under the Online Safety Act, tied to identity verification systems and higher minimum age limits for social media use.
[1] https://www.reuters.com/world/asia-pacific/malaysia-says-it-plans-ban-social-media-under-16s-2026-2025-11-24/
[2] https://www.reuters.com/business/media-telecom/malaysia-pushes-tiktok-age-verification-protect-minors-2025-09-04/
[3] https://www.komunikasi.gov.my/en/public/news/27919-fahmi-government-to-roll-out-social-media-regulatory-sandbox-from-jan-1