Ireland’s Online Safety Framework and Regulator
Ireland’s online safety regime is now anchored in the Online Safety and Media Regulation Act 2022. The Act established a single statutory regulator, Coimisiún na Meán, responsible for protecting users, and especially children, from harmful online content and practices. Established in March 2023, the Commission replaced the Broadcasting Authority of Ireland and has powers to regulate both broadcasting and online media.
The role of Coimisiún na Meán is multifaceted. It is centred on enforcing Ireland’s Online Safety Code, which sets binding safety requirements for online platforms. The Commission also acts as Ireland’s Digital Services Coordinator under the EU’s Digital Services Act (DSA), issues age assurance expectations, and levies penalties for non-compliance. Its mandate extends to monitoring harmful content, such as child sexual abuse material and extreme violence, while overseeing platform compliance with safety duties under both EU and Irish law.
Key EU Law Drivers
GDPR
The EU GDPR is directly applicable in Ireland and requires online services to process personal data lawfully, placing specific obligations on services handling children’s data. Article 8 of the GDPR sets a default digital age of consent of 16, which Member States may lower to no less than 13; Ireland has retained 16. Services therefore cannot rely on a child’s consent unless they can reliably establish the user’s age, and services that do not know a user’s age cannot assume valid consent for data processing. Furthermore, the use of children’s personal data and of special category data triggers enhanced safeguards and mandatory risk assessments. Ireland’s Data Protection Commission enforces these rules, and Coimisiún na Meán must consider data protection compliance when applying its own online safety duties.
Digital Services Act (DSA)
The Digital Services Act, which entered into force in 2022 and has applied in full since February 2024, is a pan-EU regulation that harmonises platform safety duties across Member States. Under Article 28, online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety, and security for minors. This includes restricting harmful content and taking active steps to prevent exposure to inappropriate material, such as cyberbullying and self-harm content.
In July 2025, the European Commission issued guidelines on the protection of minors under Article 28 that outline practical approaches for platforms. While these guidelines are not legally binding on their own, they are widely regarded by EU regulators as de facto standards for compliance.
Ireland’s Online Safety Code (In Force)
From July 2025, Ireland’s Online Safety Code introduced binding requirements for designated video-sharing platform services, notably those hosting adult or extremely violent content, to implement effective age assurance so that children cannot access unsuitable material. Platforms that fail to comply face significant penalties of up to €20 million or 10% of turnover, whichever is greater. This requirement has led global platforms such as X to deploy age assurance in Ireland.
Interaction with EU Law
The Irish framework does not operate in isolation. GDPR imposes data protection duties across all online services, including duties to establish age or verify parental consent where children’s data is involved. Simultaneously, the DSA creates an EU-wide requirement for platforms to protect minors, supplemented by guidance shaping how age assurance and content controls should be implemented. Coimisiún na Meán’s enforcement must align with these broader EU laws, as it acts as both a domestic regulator and Ireland’s Digital Services Coordinator.
Summary
Online safety law in Ireland today is shaped by the 2022 Act and the regulatory oversight of Coimisiún na Meán, alongside the Online Safety Code’s enforcement of age assurance. These domestic measures are underpinned by the EU GDPR’s governance of data processing and the EU Digital Services Act’s safety duties. Together, these elements create a multi-layered age assurance regime that is increasingly aligned with EU standards and evolving toward stronger identity verification while balancing privacy and safety.