Ireland’s Online Safety Code: what it means for online platforms and how to comply

Rachael Trotman | 7 min read

What you need to know:

  • Ireland’s Online Safety Code will hold video-sharing platforms accountable for keeping their users, especially children, safe online.
  • Platforms hosting adult content, including pornography or extreme violence, must use age assurance to prevent children from accessing it. These requirements come into force in July 2025.
  • Platforms that don’t comply can face severe penalties of up to €20 million or 10% of annual turnover, whichever is higher.

From July 2025, video-sharing platforms in Ireland that host pornography or extremely violent content will need to introduce age assurance to prevent children from accessing that content.

In this blog, we take a look at the Online Safety Code’s requirements and what they mean for online platforms.

 

What is the Online Safety Code?

The Online Safety Code will hold video-sharing platforms accountable for keeping their users, especially children, safe online. Platforms with pornography or extremely violent content will need to use age assurance to prevent children from accessing that content. The Code also requires video-sharing platforms to restrict certain categories of video and associated content, so that users cannot upload or share the most harmful types, including cyberbullying and the promotion of eating disorders, self-harm or suicide.

The Code is part of the overall Online Safety Framework of Coimisiún na Meán, Ireland’s media regulator. This Framework, which is underpinned by the Online Safety and Media Regulation Act (“the 2022 Act”), makes digital services responsible for how they protect users from harm online.

It is part of a global movement towards greater accountability, regulation and safety online. 

 

What is the aim of the Online Safety Code?

The Online Safety Code, along with the other elements of the Online Safety Framework, will hold online video-sharing platforms accountable for keeping their users, especially children, safe online.

Online safety laws, including Ireland’s Online Safety Code, are not about excluding children from the internet, but about giving them an experience appropriate for their age. Just as children can’t freely walk into a casino or nightclub, regulators are introducing stronger protections online to ensure children can’t access content, experiences or services unsuitable for their age.

 

Which platforms are in scope?

Video-sharing platforms that have their EU headquarters in Ireland will need to comply with the Online Safety Code.

Platforms that do not comply can face severe penalties of up to €20 million or 10% of annual turnover, whichever is higher.

 

How do online platforms comply with the Online Safety Code?

The Code places greater responsibility on video-sharing platforms to protect people from harmful content.

Under the Code, platforms must take proactive steps to ensure a safer online environment by:

  • Banning harmful content such as cyberbullying, material promoting self-harm, suicide or eating disorders, content that incites hatred or violence, terrorist content, racism and xenophobia, and child sexual abuse material.
  • Implementing robust age assurance systems to prevent children from accessing pornography and extreme violence.
  • Giving parents tools to help keep their children safe, including controls over how much time their child spends online, what types of content they see and who can see their child’s content. A video-sharing platform whose terms permit users under 16 must provide parental control mechanisms to help parents restrict the type of content their child can view (a rough sketch of such controls follows below).

In addition to these protections, platforms are expected to provide clear mechanisms for users to report content that violates the rules and to act swiftly and appropriately in line with their Terms and Conditions.
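
To make the parental controls point more concrete, below is a minimal sketch of how a platform might model such settings. The interfaces, field names and content tiers are illustrative assumptions on our part, not requirements taken from the Code itself.

```typescript
// A minimal sketch of parental control settings for a child account.
// All names, fields and tiers here are hypothetical examples.

interface ParentalControls {
  dailyScreenTimeMinutes: number | null; // null = no limit set by the parent
  maxContentTier: "general" | "teen"; // hypothetical content tiers
  childContentVisibleTo: "nobody" | "approvedContacts"; // who can see the child's uploads
}

interface ChildAccount {
  id: string;
  age: number; // established via age assurance, not self-declaration
  controls: ParentalControls;
}

// Decide whether a child may keep watching, given time already spent today.
function canContinueWatching(account: ChildAccount, minutesWatchedToday: number): boolean {
  const limit = account.controls.dailyScreenTimeMinutes;
  return limit === null || minutesWatchedToday < limit;
}
```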

 

When does the Code come into effect?

The Code is split into two parts – Part A and Part B.

Part A came into effect on 18th November 2024 and covers general rules for video content that:

  • Harms the physical, mental, or moral development of children
  • Incites hatred or violence on grounds such as sex, race, ethnicity, religion, disability or sexual orientation
  • Constitutes criminal content under EU law (such as child sexual abuse material, terrorist content, or racism and xenophobia that incites hatred or violence)
  • Involves harmful or illegal ads in videos

Part B comes into effect on 21st July 2025. It provides more detailed rules for video content that:

  • Includes cyberbullying, promoting or sharing methods of self-harm or suicide (including dangerous challenges), and promoting eating disorders
  • Encourages hatred or violence on grounds such as sex, race, ethnicity, religion, disability, sexual orientation, or membership of the Traveller or Roma communities
  • Constitutes criminal content as defined in EU law (such as child sexual abuse material, terrorist content, or racism and xenophobia that incites hatred or violence)
  • Includes harmful or illegal ads 
  • Includes restricted user-generated content (such as harmful comments or captions)
  • Is adult-only content, like pornography or extreme violence

 

Which age assurance methods can be used?

The regulator, Coimisiún na Meán, has said that self-declaration, such as ticking a box or entering a date of birth, is not effective and will not be allowed.

The regulator has not prescribed specific age assurance methods that must be used, but it has said platforms must use “effective age assurance measures.”

There is also a focus on data minimisation: platforms should not ask users to share more information than necessary. This includes a requirement to ensure that any children’s personal data collected for age assurance or parental controls is not processed for commercial purposes, such as direct marketing, profiling or behaviourally targeted advertising.

 

How can Yoti’s age assurance methods help platforms comply with Ireland’s Online Safety Code?

We offer a number of highly effective, privacy-preserving methods that can help video-sharing platforms comply. These include:

  • Facial age estimation – this is recognised by Coimisiún na Meán as an effective age assurance method. Our technology accurately estimates a person’s age from a selfie. The same image is used for a liveness check to make sure it’s a real person taking the selfie, and not a photo, video, mask or deepfake of someone else. Once the technology returns an estimated age, the image is deleted. No documents are needed and no personal details are shared.
  • Digital ID – users can securely share a verified age attribute, such as ‘over 18’ with a platform. Alternatively, they can have their age estimated in the app using our facial age estimation technology, and then share their estimated age with the platform.
  • Identity document – users can verify their age by uploading a government-issued identity document such as a passport or driving licence. We compare the image on the document to an image of the person uploading it, ensuring the correct person is using the document. Only the verified age or age range (for instance 18+) is passed to the platform; no other details from the document are shared or retained. 

When deciding which age assurance methods to use, platforms should consider how much certainty they need about users’ ages. Does the platform need to know an exact date of birth? Or is it enough to know whether a user is over or under a particular age threshold?

The solutions mentioned above allow users to share their age range, such as ‘over 18’, without sharing any other personal information. Knowing which age range a user falls into means platforms can protect children from harmful content and give users an experience appropriate for their age group.
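
As a rough illustration of this data-minimised approach, the sketch below shows how a platform might gate adult-only content on a shared age-range attribute rather than a full date of birth. The result type, method labels and field names are hypothetical assumptions for illustration, not Yoti’s actual API.

```typescript
// A minimal sketch of gating adult-only content on an age-range attribute.
// The shape of this result is hypothetical; a real integration would follow
// the age assurance provider's documented API.

type AgeAssuranceResult = {
  method: "facial-age-estimation" | "digital-id" | "identity-document";
  over18: boolean; // only the age-range outcome is shared, never a date of birth
};

function canViewAdultContent(result: AgeAssuranceResult | null): boolean {
  // No completed check means no access: self-declaration alone is not permitted.
  return result !== null && result.over18;
}

// Example: a user checked via facial age estimation passes the gate.
console.log(canViewAdultContent({ method: "facial-age-estimation", over18: true })); // true
console.log(canViewAdultContent(null)); // false
```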

For effective, scalable and privacy-preserving age assurance that can help your platform meet the Online Safety Code, please get in touch.

 

Please note this blog has been prepared for informational purposes only. You should always seek independent legal advice.
