Ireland’s Online Safety Code: what it means for online platforms and how to comply

Rachael Trotman · 7 min read

What you need to know:

  • Ireland’s Online Safety Code will hold video-sharing platforms accountable for keeping their users, especially children, safe online.
  • Platforms with adult content, including pornography or extreme violence, must use age assurance to prevent children from accessing it. These age assurance requirements come into force in July 2025.
  • Platforms that don’t comply can face severe penalties – up to €20 million or 10% of annual turnover, whichever is higher.

 


 

From July 2025, video-sharing platforms in Ireland that host pornography or extremely violent content will need to introduce age assurance to prevent children from accessing that content.

In this blog, we take a look at the Online Safety Code requirements and what they mean for online platforms.

 

What is the Online Safety Code?

The Online Safety Code will hold video-sharing platforms accountable for keeping their users, especially children, safe online. Platforms with pornography or extremely violent content will need to use age assurance to prevent children from accessing that content. The Code also requires video-sharing platforms to restrict certain categories of video and associated content, so that users cannot upload or share the most harmful types, including cyberbullying and the promotion of eating disorders, self-harm or suicide.

The Code is part of the overall Online Safety Framework of Coimisiún na Meán, Ireland’s media regulator. This Framework, which includes the Online Safety and Media Regulation Act (“the 2022 Act”), makes digital services responsible for how they protect users from harm online.

It is part of a global movement towards greater accountability, regulation and safety online. 

 

What is the aim of the Online Safety Code?

The Online Safety Code, along with the other elements of the Online Safety Framework, will hold online video-sharing platforms accountable for keeping their users, especially children, safe online.

Online safety laws, including Ireland’s Online Safety Code, are not about excluding children from the internet, but about giving them an experience appropriate for their age. Just like children can’t freely walk into a casino or nightclub, regulators are introducing stronger protections online to ensure children can’t access content, experiences or services unsuitable for their age. 

 

Which platforms are in scope?

Video-sharing platforms that have their EU headquarters in Ireland will need to comply with the Online Safety Code.

Platforms that do not comply can face severe penalties of up to €20 million or 10% of annual turnover, whichever is higher.

 

How do online platforms comply with the Online Safety Code?

The Code places greater responsibility on video-sharing platforms to protect people from harmful content.

Under the Code, platforms must take proactive steps to ensure a safer online environment by:

  • Banning harmful content such as cyberbullying, material promoting self-harm, suicide or eating disorders, and any content that incites hatred, violence, terrorism, racism or xenophobia, or that contains child sexual abuse material.
  • Implementing robust age assurance systems to protect children from accessing pornography and extreme violence.
  • Giving parents the tools to help their children stay safe, including the ability to limit how much time their child spends online, what types of content they see and who can see their child’s content. A video-sharing platform whose terms restrict users under 16 must provide parental control mechanisms to help parents restrict the type of content their child can view.

In addition to these protections, platforms are expected to provide clear mechanisms for users to report content that violates the rules and to act swiftly and appropriately in line with their Terms and Conditions.

 

When does the Code come into effect?

The Code is split into two parts – Part A and Part B.

Part A came into effect on 18th November 2024 and covers general rules for video content that:

  • Harms the physical, mental, or moral development of children
  • Incites hatred or violence based on factors like sex, race, ethnicity, religion, disability, or sexual orientation
  • Includes EU criminal content (like child sexual abuse material, terrorism, racism, or xenophobia that promotes hate or violence)
  • Involves harmful or illegal ads in videos

Part B comes into effect on 21st July 2025. It provides more detailed rules for video content that:

  • Includes cyberbullying, promoting or sharing methods of self-harm or suicide (including dangerous challenges), and promoting eating disorders
  • Encourages hatred or violence based on factors like sex, race, ethnicity, religion, disability, sexual orientation, or membership in the Traveller or Roma communities
  • Includes EU defined criminal content (like child sexual abuse material, terrorism, racism, or xenophobia that promotes hate or violence)
  • Includes harmful or illegal ads 
  • Includes restricted user-generated content (such as harmful comments or captions)
  • Is adult-only content, like pornography or extreme violence

 

Which age assurance methods can be used?

The regulator Coimisiún na Meán has said that self-declaration, such as ticking a box or entering a date of birth, is not effective and will not be allowed.

The regulator has not defined specific age assurance methods which must be used, but they have said platforms must use “effective age assurance measures.” 

There is also a focus on minimal data collection: platforms should not ask users to share more information than necessary. This includes the requirement to ensure that any children’s personal data collected for age assurance or parental controls is not processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising.

 

How can Yoti’s age assurance methods help platforms comply with Ireland’s Online Safety Code?

We have a number of highly effective, privacy-preserving methods which can help video-sharing platforms to comply. These include:

  • Facial age estimation – this is recognised by Coimisiún na Meán as an effective age assurance method. Our technology accurately estimates a person’s age from a selfie. The same image is used for a liveness check to make sure it’s a real person taking the selfie, and not a photo, video, mask or deepfake of someone else. Once the technology returns an estimated age, the image is deleted. No documents are needed and no personal details are shared.
  • Digital ID – users can securely share a verified age attribute, such as ‘over 18’, with a platform. Alternatively, they can have their age estimated in the app using our facial age estimation technology, and then share their estimated age with the platform.
  • Identity document – users can verify their age by uploading a government-issued identity document such as a passport or driving licence. We compare the image on the document to an image of the person uploading it, ensuring the correct person is using the document. Only the verified age or age range (for instance 18+) is passed to the platform; no other details from the document are shared or retained. 

When deciding which age assurance methods to use, platforms should determine how certain they need to be about users’ ages. Does the platform need to know an exact date of birth? Or is it enough to know whether users are over or under a particular age threshold?

Our solutions mentioned above allow users to share their age range, such as ‘over 18’, without sharing any other personal information. Knowing which age range a user falls into means platforms can protect children from harmful content and give users an experience appropriate for their age group.
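As an illustration of how this age-range approach might look in practice, here is a minimal, hypothetical sketch of a platform backend gating access to adult-only video content using only an ‘over 18’ result from an age assurance provider. The client interface, method name and response fields are assumptions made for the example, not a real provider API.

```typescript
// Hypothetical sketch only: the AgeAssuranceClient interface, its method name
// and the response shape are illustrative assumptions, not a real provider API.

type AgeRangeResult = {
  // Only an age-range outcome is shared with the platform; no date of birth,
  // document details or images are passed on.
  over18: boolean;
  method: 'facial_age_estimation' | 'digital_id' | 'identity_document';
};

interface AgeAssuranceClient {
  // Looks up the outcome of a completed age assurance session.
  checkAgeRange(sessionId: string): Promise<AgeRangeResult>;
}

// Decides whether a user session may access adult-only video content.
// The platform only ever learns whether the user is over 18.
async function canViewAdultContent(
  client: AgeAssuranceClient,
  sessionId: string
): Promise<boolean> {
  const result = await client.checkAgeRange(sessionId);
  return result.over18;
}
```

Keeping the check to a single age-range attribute is one way to align with the Code’s data-minimisation expectations, since the platform never needs to store a date of birth or any document details.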

For effective, scalable and privacy-preserving age assurance that can help your platform meet the Online Safety Code, please get in touch.

Please note this blog has been prepared for informational purposes only. You should always seek independent legal advice.
