Understanding age assurance in Spain’s new online safety law

Amba Karsondas · 11 min read

As digital technology continues to shape how people interact, communicate and consume content, protecting children online has become an increasingly urgent issue.

Recognising this, the Spanish government has proposed the Organic Law for the Protection of Minors in Digital Environments. The law is now in its final stages of approval.

While comparable initiatives such as the UK’s Online Safety Act and California’s Age-Appropriate Design Code exist in other jurisdictions, the Spanish law stands out for its broad scope and emphasis on enforceable age assurance, platform accountability and digital literacy. Its comprehensive framework places it among the leading examples of child online safety legislation worldwide. 

This blog explores Spain’s proposed Organic Law, focusing on how platforms can comply with its age assurance requirements.

 

What is the purpose of Spain’s proposed Organic Law?

In July 2024, the Spanish government proposed a preliminary draft of the legislation. After its second approval on 25 March by the Council of Ministers, the updated proposed Organic Law is set to become a key part of Spain’s legal framework for digital governance.

The law’s primary goal is to safeguard children from online harms while still allowing them to benefit from the opportunities offered by the digital world. It aims to protect children from harmful content, prevent online exploitation and promote responsible online behaviour.

 

Who is the proposed Organic Law aimed at?

This piece of legislation impacts a broad range of parties who either operate in Spain or target Spanish users, including:

  • Digital platforms and online services such as social media platforms, video-sharing platforms and other digital content services
  • Manufacturers of digital devices with an internet connection including smartphones, tablets, computers and smart TVs
  • Influencers and content creators
  • Video game developers and publishers
  • Healthcare and educational institutions

 

What are the proposed Organic Law’s key measures?

The legislation introduces several interconnected measures to improve children’s safety in digital environments.

 

Raising the minimum age for social media access

The law raises the minimum age for minors to access and register on social media platforms from 14 to 16 years old. This increase reflects the concern that minors may lack the maturity to fully understand the consequences of giving legal consent to the processing of their personal data online.

Children under 16 will need explicit parental consent to create and manage accounts on digital platforms. These include social media networks, gaming sites and other interactive platforms.

 

Introducing robust age verification and content classification

All platforms that host age-sensitive content, including social media sites, must be classified based on recommended age. Platforms should clearly label and communicate the classification to users, with risk information provided in simple language.

Platforms must also implement robust and secure age verification systems to ensure young people can only access age-appropriate content. Self-declaration, such as ticking a box to confirm one’s age, is not acceptable.

 

Implementing parental control systems

The law seeks to empower guardians to protect their children online. It mandates that manufacturers of digital devices with an internet connection offer effective, easy-to-use and free parental control systems. Systems must be pre-installed and activated by default when the device is initially set up. These tools will allow guardians to:

  • Restrict access to inappropriate content and apps.
  • Monitor screen time and online behaviour.
  • Set usage schedules and device access limitations.

Any personal data collected or created when this feature is used cannot be utilised for commercial purposes like targeted ads, profiling or direct marketing. This shift places greater responsibility on device manufacturers, rather than solely on app developers or parents.

 

Regulating influencers and content creators

The proposed Organic Law recognises the influence of online personalities on young people. The law proposes stricter regulations for influencers and digital content creators. Those with large followings must:

  • Comply with content regulations similar to those of traditional media, including the type of content they publish and restrictions on time slots.
  • Implement effective age checks to establish the ages of their users.
  • Avoid publishing content containing gratuitous violence, sexual themes or other material inappropriate to users identified as children.
  • Clearly label sponsored or promotional content to avoid deceptive marketing practices.

 

Restrictions on video game developers and publishers

The law prohibits video game developers and publishers from allowing users under the age of 18 to access or activate random reward mechanisms, commonly known as “loot boxes”. These are considered potentially addictive and comparable to gambling.

 

Criminalising harmful online behaviours

The law includes modifications to the Spanish Criminal Code. It introduces several criminal offences related to online activities, including:

  • Sexually explicit deepfake content – Criminalising the creation and distribution of AI-generated or manipulated explicit content involving minors. It penalises those who share or display digitally generated, altered or recreated images or audio portraying sexual or seriously degrading content without the consent of the person involved and with the intent to harm their moral integrity.
  • Digital restraining orders – Courts can impose restrictions on individuals to prevent them from contacting or interacting with minors through online platforms.
  • Online grooming and harassment – Stronger penalties are introduced for adults who engage in online grooming, cyberstalking or the harassment of minors. This includes using fake identities online, such as lying about age, to commit crimes.

 

Introducing measures to protect victims of gender-based or sexual violence

A key addition in this draft law, compared to the earlier version, is the introduction of Title III. This is a structural addition to the current version of the text, setting out new measures to protect victims of gender-based and sexual violence. It clearly states that people who have experienced such violence in digital spaces are now officially recognised as victims under Organic Law 1/2004 and Organic Law 10/2022.

The law also guarantees children free and ongoing access to 24/7 support services, both by telephone and online. This includes psychological care, legal advice, crisis centres and specialised shelters.

 

Promoting digital literacy and education

The law includes a focus on education. It proposes the development of a national digital literacy strategy for minors, educators and parents. It also establishes public digital culture laboratories. These aim to be spaces where young people can safely explore digital creativity and innovation.

It also promotes programmes to educate students about digital rights, data privacy, online ethics and the risks of digital platforms. This approach emphasises that children need digital literacy skills alongside technical safeguards to navigate the internet responsibly.

 

Health promotion and addiction prevention

The law also introduces guidelines for the prevention of digital addiction, which will be issued by health authorities. This measure aims to position digital wellbeing as part of a child’s overall health.

Paediatric digital wellness checks will be incorporated into routine primary care visits. Paediatricians will include questions about screen time and digital device usage in health consultations and educational programmes, allowing them to assess digital habits and advise families accordingly.

 

How does the Draft Organic Law interact with existing legislation?

The Draft Organic Law builds on existing frameworks such as the General Law on Audiovisual Communication (LGCA), which regulates audiovisual media services in Spain.

It also works alongside the Law on the Protection of Personal Data and Guarantee of Digital Rights. This adapts the GDPR’s provisions to the Spanish legal framework.

The new piece of legislation aims to create a more comprehensive approach to online child protection by:

  • Addressing areas not adequately covered by existing legislation, such as the regulation of digital platforms and services.
  • Aligning national regulations with international standards and best practices in child online safety.
  • Providing clearer guidelines and stronger enforcement mechanisms to ensure compliance and accountability.

 

Who will enforce the proposed Organic Law?

Enforcement of the proposed Organic Law will involve several authorities, including:

  • The Spanish Data Protection Agency (AEPD) – will be responsible for overseeing compliance with data protection aspects of the law.
  • National Commission on Markets and Competition (CNMC) – will oversee age verification requirements and can suspend video-sharing platforms that commit very serious violations, such as failing to set up proper age verification systems.
  • Consumer protection agencies – will ensure digital products and services meet safety standards.
  • Educational authorities – will implement educational initiatives and promote digital literacy in schools.
  • Law enforcement agencies – will be responsible for investigating and prosecuting online criminal activities such as grooming and abuse.

 

What are the proposed Organic Law’s age verification requirements?

The law requires platforms to implement effective age verification systems. They must:

  • confirm that a user is old enough to access certain content, rather than verifying whether someone is a minor.
  • clearly determine whether a user is “authorised” to access content, using non-invasive methods that do not rely on collecting extra data such as gender or ethnicity.
  • be anonymous, with all processing done on the user’s device, and neither share personal data nor create traceable patterns.
  • not rely on profiling or continuous user tracking; users should only need to prove their age when accessing age-restricted content.
  • avoid profiling users based on browsing habits, and label content with clear descriptors (e.g. violent, sexual) to match age-appropriate access.
  • not track user activity across services; platforms should use local data processing to prevent the tracking, locating or identifying of minors online, and should avoid centralised databases to protect children’s privacy.
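To make the “anonymous, on-device, data-minimised” idea concrete, here is a minimal sketch of an age check that returns only a yes/no assertion, so that neither the date of birth nor the exact age ever leaves the user’s device. All names here (the `data_minimised_age_result` function and the `over_16` claim key) are hypothetical illustrations, not part of the Organic Law or any specific provider’s API.

```python
from datetime import date

def data_minimised_age_result(dob: date, threshold: int, today=None) -> dict:
    """Return only a yes/no age assertion, never the date of birth itself.

    Illustrative sketch: in a privacy-preserving design, this computation
    would run locally on the user's device, and only the returned claim
    would be shared with the platform.
    """
    today = today or date.today()
    # Whole years elapsed, accounting for whether the birthday has
    # occurred yet this calendar year.
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    # The platform receives only this minimal claim; the date of birth
    # and the computed age stay on the device.
    return {f"over_{threshold}": age >= threshold}

# Example: a user born in June 2010, checked against the new 16-year threshold
print(data_minimised_age_result(date(2010, 6, 1), 16, today=date(2025, 1, 1)))
# → {'over_16': False}
```

The design choice worth noting is the return value: a single boolean claim rather than an age or birth date, which is what the “data minimisation” requirement points towards.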

Digital service providers must ensure age verification mechanisms comply with these guidelines. The proposed Organic Law does not outline specific age assurance mechanisms that platforms and services can use. The CNMC will evaluate the adequacy of the age verification systems implemented by platforms.

Additionally, companies must conduct a Data Protection Impact Assessment (DPIA) when introducing or modifying age verification systems to ensure compliance with data protection laws and to assess any potential risks to minors.

 

How can platforms comply with the proposed Organic Law’s age assurance requirements?

We believe platforms should strike a balance between robust age checks and protecting user privacy. Offering individuals a choice of effective methods to prove their age online is key to achieving this. Providing a range of options helps ensure that age assurance is both inclusive and accessible to all users.

We offer over 10 ways for people to prove their age including:

  • Identity documents – Users can verify their age by uploading a government-issued identity document, such as a passport or driving licence. The photo on the document is compared with a live image of the person uploading it to confirm their identity. Once verified, the only information shared with the platform is a data-minimised age result (e.g. ‘over 18’).
  • Facial age estimation – This method estimates a person’s age from a selfie. The same image is also used to check for liveness, ensuring it’s a real person taking the selfie, and not a photo, video or mask of someone else. No documents are required, and no personal details are shared. Once the technology returns an estimated age, the image is immediately deleted.
  • Digital ID app – Users can securely share their age with a platform via a Digital ID app. They can add their age using a verified identity document or choose facial age estimation within the app to determine their age without needing to upload documents.

Alongside these, we can check age using email addresses, databases and national eID schemes.

 

What’s next for the proposed Organic Law?

The proposed Organic Law is currently in its final stages of processing and approval. Once passed, there will be a transition period for businesses to comply with its requirements. 

Non-compliance after the implementation period can lead to penalties, including administrative, civil and criminal sanctions.

We will update this blog as more information is released. Future updates will include details such as the exact amounts of fines, timeframes for compliance, the types of infractions associated with each sanction, potential civil sanctions and civil liability for damages.

 

Effective age checks to protect children under Spain’s proposed Organic Law

The proposed Organic Law for the Protection of Minors in Digital Environments represents a major shift in children’s online safety in Spain.

By requiring effective age assurance measures, regulating influencers, promoting digital literacy and criminalising harmful behaviours, the law aims to prioritise the wellbeing of young people in the digital world.

If you want to learn more about how your business can implement effective age checks, get in touch.
