Spain
Spain’s online safety framework is based on EU law, national audiovisual regulation and child-protection legislation, combined with an increasingly interventionist policy agenda on age assurance and youth wellbeing online. In 2026, Spain significantly escalated its regulatory approach with a planned nationwide social media ban for under-16s and new executive liability measures.
National Legal Framework and Regulators
Spain regulates online content and platform safety through:
- Audiovisual regulation
- Child-protection and consumer protection law
- Directly applicable EU legislation
CNMC
The Comisión Nacional de los Mercados y la Competencia (CNMC) supervises audiovisual media services and video-sharing platforms under Spanish and EU-derived law.
Spain has transposed the Audiovisual Media Services Directive (AVMSD) into domestic law. This requires video-sharing platforms to protect minors from harmful content, including:
- Pornography
- Extreme violence
To meet these duties, platforms may be required to deploy effective age verification or other age assurance mechanisms where appropriate.
Data Protection Authority
The Agencia Española de Protección de Datos (AEPD) enforces data protection law and plays a central role in child online safety, particularly in cases involving profiling, harmful content exposure and inadequate age checks.
GDPR and Children’s Data
The EU General Data Protection Regulation (GDPR) applies directly in Spain.
Under Spanish law, the digital age of consent is 14. Children under 14 cannot lawfully consent to the processing of personal data without parental authorisation.
Practical consequences for online services include:
- Platforms relying on consent must determine whether users are at least 14
- Services that do not know users’ ages cannot safely rely on consent
- Children’s data requires heightened safeguards and risk assessment
The AEPD has taken enforcement action where platforms failed to implement adequate protections for minors or relied on ineffective age-declaration mechanisms.
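For illustration only, the following is a minimal sketch of how a service might gate consent-based processing on Spain's digital age of consent of 14. It is not drawn from AEPD guidance; the function and field names are hypothetical, and a real implementation would also need a reliable way of establishing the user's age in the first place.

```python
from datetime import date

# Spain sets the digital age of consent at 14 (LOPDGDD, Art. 7).
DIGITAL_AGE_OF_CONSENT = 14


def age_in_years(birth_date: date, today: date) -> int:
    """Return a user's age in whole years on the given day."""
    years = today.year - birth_date.year
    # Subtract one year if this year's birthday has not happened yet.
    if (today.month, today.day) < (birth_date.month, birth_date.day):
        years -= 1
    return years


def can_rely_on_consent(
    birth_date: date | None,
    parental_authorisation: bool = False,
    today: date | None = None,
) -> bool:
    """True only if consent is an available lawful basis for this user.

    A service that does not know the user's age (birth_date is None)
    cannot safely rely on consent at all.
    """
    if birth_date is None:
        return False
    today = today or date.today()
    if age_in_years(birth_date, today) >= DIGITAL_AGE_OF_CONSENT:
        return True
    # Under-14s need verified parental authorisation.
    return parental_authorisation


# A 12-year-old without parental authorisation cannot consent.
print(can_rely_on_consent(date(2013, 5, 20), today=date(2026, 2, 3)))  # False
```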
Digital Services Act (DSA)
The EU Digital Services Act (DSA) applies directly in Spain.
Spain has designated the CNMC to act as the Digital Services Coordinator.
Under Article 28 DSA, platforms likely to be accessed by minors must take appropriate and proportionate measures to protect children, including:
- Limiting exposure to harmful or age-inappropriate content
- Addressing design features that may cause harm
- Mitigating risks arising from recommender systems
Platforms operating in Spain may face scrutiny for:
- Failure to protect minors
- Ineffective age assurance systems
- Engagement-driven or addictive design features
Social Media Ban (Under-16s)
On 3 February 2026, Prime Minister Pedro Sánchez announced a landmark plan to ban children under 16 from social media platforms, including services such as TikTok and Instagram.[1]
Key elements include:
- Draft legislation expected to proceed through the Council of Ministers in mid-February 2026
- A legal prohibition on account access for users under 16
- Mandatory technical enforcement mechanisms
This represents a significant escalation beyond earlier policy proposals and signals a move toward mandatory, enforceable age controls rather than reliance on platform self-regulation.
EU Digital Identity Wallet and Age Verification
Spain is leading development of the EU Digital Identity Wallet prototype, which moved into active testing in late 2025.
The Wallet is intended to:
- Enable users to prove age without disclosing full identity details
- Serve as the primary enforcement mechanism for the under-16 social media ban
- Support age verification for access to adult content and other restricted services
The Wallet has evolved from a policy announcement to an operational testing phase and forms a core component of Spain’s age assurance strategy.
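As an illustration of the "prove age without disclosing identity" idea, the sketch below shows a verifier that accepts a signed over-16 attestation carrying no name or date of birth. This is not the EUDI Wallet protocol itself; the issuer key handling, field names and HMAC-based signature are simplifying assumptions standing in for the Wallet's credential and signature formats.

```python
import hashlib
import hmac
import json

# Stand-in for the issuer's key material. A real wallet scheme would use
# public-key signatures from a trusted issuer, not a shared HMAC secret.
ISSUER_SECRET = b"demo-issuer-key"


def issue_age_attestation(over_16: bool) -> dict:
    """Issuer side: sign a claim containing only an over/under-16 flag.

    No name, date of birth or other identifier is included.
    """
    claim = {"claim": "age_over_16", "value": over_16}
    payload = json.dumps(claim, sort_keys=True).encode()
    signature = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "signature": signature}


def verify_age_attestation(attestation: dict) -> bool:
    """Platform side: admit the user only on a valid over-16 claim."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(ISSUER_SECRET, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, attestation["signature"]):
        return False  # tampered with, or not from the trusted issuer
    return attestation["claim"] == {"claim": "age_over_16", "value": True}


# The platform learns only that the user is over 16, nothing else.
token = issue_age_attestation(over_16=True)
print(verify_age_attestation(token))  # True
```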
Pornography and Age Verification
Spain has been at the forefront of EU-level discussions on mandatory age verification for access to online pornography.
The government has committed to requiring robust age verification for adult content websites accessible in Spain. This policy is linked to:
- Concerns about early exposure to pornography
- Child development and wellbeing
- Broader digital identity infrastructure initiatives
The EU Digital Identity Wallet is expected to play a central role in enforcing age checks for adult content.
Criminal Liability for Platform Executives
Spain’s 2026 regulatory package includes provisions to introduce criminal liability for technology company executives.
Under these measures:
- Tech CEOs may face criminal responsibility if their platforms fail to remove illegal content after notification
- Relevant content includes material such as:
  - Sexual deepfakes
  - Child sexual abuse material
This provision significantly strengthens enforcement by attaching personal liability to platform leadership in cases of non-compliance.