Singapore

Singapore operates a structured and proactive online safety regime, characterised by sector-specific regulation rather than a single horizontal platform liability statute. Child protection online is primarily driven through broadcasting and media regulation, backed by strong enforcement powers and clear expectations regarding age-restricted content. While Singapore does not prescribe a single approved age-verification method, age assurance is an established risk-mitigation control expected where services expose users to adult or harmful material. IMDA’s approach focuses on platform accountability and effective safeguards rather than mandating user identification or a single technical solution.

The Infocomm Media Development Authority (IMDA) oversees this landscape, regulating social media, video-sharing, and app distribution services under the Broadcasting Act and subsidiary legislation. IMDA can require access controls on age-restricted content, issue binding directions to platforms, and impose financial penalties for non-compliance.

Codes of Practice and App Store Gatekeeping

Singapore’s framework was significantly strengthened by the Online Safety (Miscellaneous Amendments) Act, which introduced binding Codes of Practice for Online Safety for designated online communication services. These Codes embed a safety-by-design, risk-proportionate compliance model: platforms must mitigate children’s exposure to harmful content, provide user-reporting and parental controls, assess foreseeable harms, and implement access controls and other safeguards proportionate to the risks identified.

On 31 March 2025, the Code of Practice for App Distribution Services took effect. This Code requires designated app distribution services with significant reach or impact — namely Apple App Store, Google Play Store, Huawei App Gallery, Microsoft Store and Samsung Galaxy Store — to put in place system-level measures to curtail the risk of exposure to harmful content for users, especially children. Specifically, designated app stores must implement age assurance controls — methods more robust than self-declaration where risk warrants, such as device-level controls, account verification mechanisms, or privacy-preserving age assurance solutions — to prevent minors from accessing age-inappropriate apps or games.

From 1 April 2026, app stores offering services in Singapore are required to screen and prevent users estimated to be below 18 years old from downloading age-inappropriate apps. The Code applies to designated app distribution services and reinforces distribution platforms as key enforcement checkpoints. Singapore is among the first countries to mandate such age assurance measures for app distribution services, setting a model for other nations to follow.

Age-Restricted Content and Outcome-Based Compliance

Under Singapore’s content classification framework, explicit sexual content and pornography must be inaccessible to minors. IMDA emphasises outcome-based compliance: platforms must demonstrate that safeguards effectively prevent underage access in practice, and simple self-declared age is insufficient where foreseeable risks exist. This outcome-based standard places the burden on providers to demonstrate real-world effectiveness, rather than mere policy compliance.

Singapore remains technology-neutral: regulators do not mandate a specific verification method, but providers must select controls proportionate to risk and capable of withstanding regulatory scrutiny. Age assurance refers to systems or processes to establish a person’s age or age range, including age verification, age estimation, and age inference, enabling organisations to make age-related eligibility decisions with varying degrees of certainty.
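As an illustration only — not a mechanism specified by IMDA or any Code of Practice — the definition above (establishing an age or age range with varying degrees of certainty, then making an eligibility decision) can be sketched in code. All names and thresholds below are hypothetical assumptions chosen for the example:

```python
from dataclasses import dataclass

# Hypothetical model of an age assurance signal. The method names mirror
# the verification/estimation/inference distinction described in the text;
# nothing here comes from IMDA's actual requirements.
@dataclass
class AgeSignal:
    method: str        # "verification" | "estimation" | "inference" | "self_declaration"
    estimated_age: int
    confidence: float  # 0.0-1.0: certainty in the age or age-range result

# Illustrative confidence floors a method must meet before it may gate
# age-restricted (18+) content. Self-declaration is deliberately absent:
# the text notes it is insufficient where foreseeable risks exist.
MIN_CONFIDENCE = {
    "verification": 0.9,
    "estimation": 0.8,
    "inference": 0.7,
}

def may_access_adult_content(signal: AgeSignal) -> bool:
    """Outcome-oriented check: the signal must come from a recognised
    method, meet that method's confidence floor, and indicate the user
    is 18 or over. Otherwise access to age-restricted content is denied."""
    floor = MIN_CONFIDENCE.get(signal.method)
    if floor is None:  # e.g. bare self-declaration: rejected outright
        return False
    return signal.confidence >= floor and signal.estimated_age >= 18
```

A service would tune the methods and floors to its own risk profile — the point of the sketch is only that the decision combines method, result, and certainty, rather than trusting a single self-declared age.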

Social Media Protection and Emerging Age Limits

Singapore does not currently impose a universal statutory minimum age for all social media use. However, designated services face heightened duties toward child users, including enforcing their own minimum-age policies, reducing children’s exposure to harmful content, and providing parental supervision tools. Officials have indicated they are monitoring international proposals and regional developments related to minimum social media age thresholds.

Data Protection and Privacy (PDPA)

The Personal Data Protection Act (PDPA) provides the baseline for personal data processing, including children’s data. It does not set a fixed digital age of consent but relies on meaningful consent and the individual’s capacity to understand the context.

For age assurance systems, this requires data minimisation, proportionality, and strict safeguards for identity or biometric data, including purpose limitation and the use of privacy-enhancing technologies where sensitive data is processed. Age verification data must not be repurposed for profiling or advertising, reinforcing Singapore’s expectation of privacy-preserving safety controls.

Enforcement

IMDA has a wide enforcement toolkit, including financial penalties of up to S$1 million, binding remedial directions, and access-blocking orders. Singapore’s regime aligns with international safety-by-design principles and proactive platform responsibility, reflecting regulatory approaches emerging in the United Kingdom, European Union, Australia, and other jurisdictions.

PLEASE NOTE: This page summarises current law and proposals and does not constitute legal advice. Always consult independent legal advisers before making compliance decisions.