Italy

Italy’s online safety framework combines EU law, national audiovisual regulation and data protection law, with a strong and increasingly enforcement-driven focus on age assurance and the protection of minors online. Italy has been particularly active in mandating age verification for access to pornography and in using data protection law to enforce age-related obligations.

National Legal Framework and Regulators

Online content and platform safety in Italy are regulated primarily through:

  • Audiovisual and media law
  • Data protection law

AGCOM

The Autorità per le Garanzie nelle Comunicazioni (AGCOM) is Italy’s main sector regulator for media and platforms. Its responsibilities include:

  • Supervising audiovisual media services and video-sharing platforms
  • Enforcing Italian law implementing the Audiovisual Media Services Directive (AVMSD)
  • Ensuring platforms adopt appropriate measures to protect minors from harmful content, including:
    • Pornography
    • Content likely to impair children’s physical, mental or moral development

Data Protection Authority (Garante)

Data protection oversight sits with the Garante per la protezione dei dati personali, which plays a central role in enforcing child safety obligations online. The Garante has been a key driver of age verification enforcement in Italy.

GDPR and Children’s Data

The EU General Data Protection Regulation (GDPR) applies directly in Italy.

Under Italian law, the digital age of consent is 14. Children under 14 cannot validly consent to the processing of their personal data without parental authorisation.

This has practical implications for online services:

  • Services relying on consent must be able to determine whether users are at least 14
  • Platforms that do not know a user’s age cannot safely assume valid consent
  • Processing children’s data requires enhanced safeguards and risk assessments
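
For illustration only, the short sketch below (in Python, using hypothetical function names) shows how a service might encode the 14-year threshold as a precondition for relying on a child's own consent. It assumes the service already holds a birth date and a parental-authorisation flag; in practice, a purely self-declared birth date would not by itself satisfy the Garante's expectations for effective age assurance.

  # Minimal sketch (illustrative only): the Italian digital age of consent (14)
  # as a precondition for relying on a child's own consent.
  # Names and the parental_authorisation flag are hypothetical assumptions.
  from datetime import date

  ITALIAN_AGE_OF_CONSENT = 14

  def age_on(today: date, birth_date: date) -> int:
      """Age in whole years on a given date."""
      had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
      return today.year - birth_date.year - (0 if had_birthday else 1)

  def can_rely_on_own_consent(birth_date: date, today: date,
                              parental_authorisation: bool = False) -> bool:
      """Under 14, consent is valid only with parental authorisation."""
      if age_on(today, birth_date) >= ITALIAN_AGE_OF_CONSENT:
          return True
      return parental_authorisation

  # Example: a 12-year-old without parental authorisation -> no valid consent.
  print(can_rely_on_own_consent(date(2013, 6, 1), date(2026, 2, 1)))  # False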

The Garante has taken repeated enforcement action where platforms:

  • Failed to adequately protect minors’ personal data
  • Relied on weak or purely declaratory age-assurance mechanisms

Digital Services Act (DSA)

The EU Digital Services Act (DSA) applies directly in Italy.

Italy has designated AGCOM as the Digital Services Coordinator, responsible for supervising and enforcing DSA obligations in cooperation with other competent authorities, including the Garante.

Under Article 28 DSA, providers of online platforms accessible to minors must put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors, including:

  • Addressing harmful or age-inappropriate content
  • Mitigating risks arising from platform design and recommender systems
  • Implementing effective child-protection safeguards

Italian regulators increasingly expect platforms to demonstrate how children are protected in practice, rather than relying solely on stated policies or parental control tools.

Pornography and Mandatory Age Verification

Italy has taken some of the strongest enforcement action in Europe on age verification for access to pornography.

The Garante has issued binding orders requiring major adult content platforms to implement robust age-verification systems to prevent access by minors. These orders make clear that:

  • Self-declared age checks are insufficient
  • Easily circumvented measures do not meet legal standards
  • Age verification systems must be effective in practice

Italy’s approach is frequently cited at EU level as an example of how GDPR, child-protection principles and audiovisual regulation can be combined to mandate age assurance.

Since February 2026, the regulator has made clear that these age-verification requirements apply to services wherever they are established, not only to sites established in Italy. This follows clarification from the European Commission that Member States may enforce domestic laws in this field against services established in other Member States.

Social Media and Age Assurance

While Italian law sets a minimum age of 14 for valid data-processing consent, authorities have taken a broader approach to social media platforms where under-age access presents foreseeable risks.

The Garante has taken enforcement action where:

  • Children under 14 were able to register without effective age checks
  • Platforms enabled profiling of minors without lawful consent

Regulatory decisions have emphasised that:

  • Terms and conditions alone are insufficient
  • Platforms must take reasonable steps to verify age where risks to children are foreseeable

There has also been sustained political and regulatory discussion in Italy about strengthening age-assurance obligations for social media, particularly following high-profile cases involving harm to minors online.