The Online Safety Bill has now been approved by the House of Lords and proceeds to Royal Assent, the final stage at which the monarch formally approves a bill and it becomes law. The Bill covers a wide range of issues, including minimising the risk of children seeing harmful and age-inappropriate content, removing illegal content like child sexual abuse material (CSAM), tackling fraudulent and scam adverts, and introducing age verification for certain online services.
This blog looks at some of the age requirements in the Bill and what this means for tech companies, adult sites, gaming companies, social media platforms and dating sites.
What is the purpose of the Online Safety Bill?
According to the UK government, the Online Safety Bill aims to make the UK ‘the safest place in the world to be online’. It primarily seeks to protect children and adults from online content that is harmful or illegal. The Bill aims to create a safer digital environment that prioritises user safety and holds technology companies accountable for their actions.
As the appointed regulator, Ofcom will ensure companies are proactively assessing the risks of harm to their users and introducing safeguards to keep them safe online.
The Bill has now completed its passage through Parliament and awaits Royal Assent. We expect Ofcom to consult the industry in phases, publish codes of practice, and begin enforcing the new rules in the 12 months following Royal Assent.
Is this the first piece of UK legislation which addresses online safety?
The Audiovisual Media Services Directive (AVMSD), an EU Directive updated in 2018 and implemented by the UK, contains similar provisions to the Online Safety Bill. Its scope, however, was much narrower, covering video-on-demand platforms such as Netflix and Amazon Prime, where content is curated by the provider and there are usually no user-to-user functions. Ofcom was chosen as the regulator to enforce the Directive in the UK.
In 2021, the Information Commissioner’s Office introduced the Age-Appropriate Design Code (also called the AADC or Children’s Code) as required by the Data Protection Act (DPA) 2018. The Code is enforceable under the UK GDPR and DPA and imposes a set of standards that seek to ensure online services are designed in the best interests of a child.
Who is impacted by age verification requirements within the Online Safety Bill?
The requirements apply to companies that have a ‘significant’ number of UK users, or whose services can be accessed by UK users and present a risk of online harm, such as platforms hosting user-generated content. This includes gaming companies, dating sites, social media platforms and adult sites.
These companies will have to review, and possibly adapt, the way they design, operate, and moderate their platforms to ensure they meet the aims of the new online safety regulation.
Under the provisions of the Bill, all regulated services will have a duty of care in relation to illegal content and, where services are likely to be accessed by children, a duty to protect children from harm.
Online services will be designated into one of three categories, depending on their number of users and the functionalities they offer. However, the thresholds for each category have not yet been determined and will be set out in secondary legislation.
How will companies comply with the Online Safety Bill?
The Online Safety Bill will impose a ‘duty of care’ on platforms, meaning they must keep users safe whilst using their services. They will have to complete risk assessments and develop proactive measures to address potential harms.
Components of the Bill include:
- Preventing illegal content from appearing, and removing it quickly when it does
- Preventing children from accessing content that is harmful or inappropriate for their age
- Introducing age thresholds for the use of certain services based on the risk they present, and implementing robust methods to check the age of users (a minimal sketch of such a check follows this list)
- Introducing a duty of transparency for large platforms and social media sites to disclose the methodology of their risk analysis, the preventative measures put in place and their effectiveness
- Providing parents and children with clear ways to report problems online
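By way of illustration, the Python sketch below shows how a platform might gate an age-restricted feature behind a completed age check. All names (AgeBand, User, check_access) and the age bands themselves are hypothetical; they are not drawn from the Bill, Ofcom guidance or any real platform.

```python
# Minimal sketch of an age-gate, assuming a hypothetical service that stores
# a verified age band per user. All names and thresholds are illustrative.
from dataclasses import dataclass
from enum import Enum

class AgeBand(Enum):
    UNKNOWN = "unknown"     # no age check completed yet
    UNDER_13 = "under_13"
    AGE_13_17 = "13_17"
    ADULT = "18_plus"

@dataclass
class User:
    user_id: str
    age_band: AgeBand = AgeBand.UNKNOWN

def check_access(user: User, min_band: AgeBand) -> str:
    """Decide whether a user may access an age-restricted feature."""
    order = [AgeBand.UNDER_13, AgeBand.AGE_13_17, AgeBand.ADULT]
    if user.age_band is AgeBand.UNKNOWN:
        # Route the user through an age check before granting access.
        return "verification_required"
    if order.index(user.age_band) >= order.index(min_band):
        return "allowed"
    return "denied"

if __name__ == "__main__":
    print(check_access(User("u1"), AgeBand.ADULT))                     # verification_required
    print(check_access(User("u2", AgeBand.AGE_13_17), AgeBand.ADULT))  # denied
    print(check_access(User("u3", AgeBand.ADULT), AgeBand.ADULT))      # allowed
```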
How will the age verification requirements in the Bill impact social media platforms and tech companies?
To comply with the regulations, social media platforms and tech companies will have a duty of care to keep children safe online. They will need to develop systems for detecting and removing harmful content, enforce stricter age limits, and provide filtering tools that give users more control over the content they see. They should also provide better protection, particularly for children, from cyberbullying, online harassment, hate speech and child exploitation.
In addition, the Online Safety Bill introduces a number of new criminal offences: a false communication offence; a threatening communication offence; an offence of sending flashing images with intent to cause epileptic seizures; and a cyber-flashing offence (the sending of unsolicited nude images via social media or dating apps).
Social media platforms will need to introduce stricter age limits and explain in their terms of service how they implement these age limits.
How will the age verification requirements in the Bill impact adult sites?
Research by the Children’s Commissioner found that the average age at which children first see pornography is 13, and that 38% of 16 to 21-year-olds have accidentally been exposed to pornographic content online.
Parliament has chosen to require services that publish or allow pornography on their sites to explicitly use age verification or age estimation measures to prevent children from accessing this content. An amendment tabled in the Lords and accepted by the government will hold these platforms to a higher standard: they will need to use age-checking measures that are highly effective at correctly determining whether or not a particular user is a child.
How will the age verification requirements in the Bill impact gaming companies?
93% of children in the UK play video games. As such, many gaming companies will be impacted by the Bill and have to play their part in keeping children safe online.
Online video games will be in the scope of the Bill if they:
- offer user-to-user interaction or allow user-generated content
- contain written chat functionality or group chat options
- have players in the UK
Like other companies impacted by the Bill, gaming platforms will need to comply with the general duties imposed on all regulated services. Gaming companies will also need to carry out child risk assessments to determine whether their game is likely to be accessed by children and, if so, what risks it poses to them.
Gaming platforms will need to implement age assurance measures so they know the age or age range of their players. Once they know the age of users, they can deliver an age-appropriate experience. This might involve limiting high-risk features, such as voice chat, for specific age groups, or age-gating content deemed inappropriate for players under a certain age (a minimal sketch follows).
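As a rough sketch of what such feature gating might look like, the snippet below maps age bands to feature availability in an online game. The feature names and age thresholds are entirely hypothetical; in practice they would come from a platform’s own child risk assessment.

```python
# Illustrative only: minimum age required for each high-risk game feature.
# These names and thresholds are assumptions, not taken from the Bill.
FEATURE_MIN_AGE = {
    "text_chat_filtered": 0,    # available to all players
    "voice_chat": 13,           # limited for younger age groups
    "direct_messages": 16,
    "user_generated_levels": 18,
}

def available_features(player_age: int) -> list[str]:
    """Return the features an age-appropriate experience would enable."""
    return [f for f, min_age in FEATURE_MIN_AGE.items() if player_age >= min_age]

print(available_features(10))  # ['text_chat_filtered']
print(available_features(15))  # ['text_chat_filtered', 'voice_chat']
```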
How will the age verification requirements in the Bill impact dating sites?
The Bill will introduce a new criminal offence of cyber-flashing – the sending of unsolicited nude images via dating apps. It will also aim to tackle romance fraud, which sees people tricked into sending money to scammers on dating sites.
The regulator has not yet issued guidance on how dating sites will be impacted by the Online Safety Bill. One thing the Online Dating Association (ODA) has championed is for Ofcom to closely align its Online Safety Bill guidance with the ICO’s guidelines on the Children’s Code (or Age Appropriate Design Code).
The ICO’s guidance says that if a significant number of children are likely to access a dating service, even one not designed for children, the site should introduce robust age checks or conform to the standards in the Children’s Code. The guidance includes two dating sector-specific use cases, covering two different types of dating services – those likely to see attempted access by minors and those that are not.
Whilst we wait for Ofcom to publish its own guidelines on how companies can comply with the Online Safety Bill, dating sites can start reviewing their platforms. An 18+ site should be considering what percentage of its users might be underage, and looking for evidence that underage people are likely to engage with its members.
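To make that kind of review concrete, here is a rough sketch of the sort of estimate an 18+ site might run over a sample of accounts. The signals, the two-signal threshold and the field names are entirely hypothetical assumptions; a real review would rest on the site’s own data and risk assessment.

```python
# Hypothetical sketch: estimate the share of sampled accounts showing
# signals of being underage. Signals and thresholds are assumptions.
def estimate_underage_share(users: list[dict]) -> float:
    flagged = 0
    for u in users:
        signals = 0
        if u.get("self_declared_age", 99) < 18:
            signals += 1
        if u.get("estimated_age", 99) < 18:   # e.g. from facial age estimation
            signals += 1
        if u.get("school_terms_in_profile"):  # profile language suggesting a minor
            signals += 1
        if signals >= 2:                      # require corroborating signals
            flagged += 1
    return flagged / len(users) if users else 0.0

sample = [
    {"self_declared_age": 19, "estimated_age": 17, "school_terms_in_profile": True},
    {"self_declared_age": 25, "estimated_age": 27, "school_terms_in_profile": False},
]
print(f"{estimate_underage_share(sample):.0%} of sampled accounts flagged")  # 50%
```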
A number of dating sites have already started looking into age assurance technology and how it can improve user safety and trust.
How Yoti can help
The UK’s Online Safety Bill aims to improve internet safety by making online platforms more responsible for regulating and reducing harmful content. It seeks to strike a balance between protecting users, especially children and vulnerable adults, and preserving freedom of expression.
When it comes to the age verification requirements, it will be important for regulated companies to balance effective age checking with privacy. We believe that people should be able to choose between different methods to prove their age online. This will be essential for making sure age checking is inclusive and accessible for everyone.
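One way to picture offering users a choice of methods is to put each method behind a common interface, as in the sketch below. The class names, fields and return values are illustrative assumptions only; this is not a description of Yoti’s actual API or of any mandated approach.

```python
# Minimal sketch of letting users choose between age-check methods behind
# one interface. All names here are hypothetical.
from abc import ABC, abstractmethod

class AgeCheckMethod(ABC):
    @abstractmethod
    def check(self, user_input: dict) -> bool:
        """Return True if the user meets the age threshold."""

class FacialAgeEstimation(AgeCheckMethod):
    def check(self, user_input: dict) -> bool:
        # A real implementation would call an age-estimation service with an
        # image; here we just read a precomputed estimate for illustration.
        return user_input.get("estimated_age", 0) >= 18

class IdDocumentCheck(AgeCheckMethod):
    def check(self, user_input: dict) -> bool:
        return user_input.get("document_verified_age", 0) >= 18

METHODS: dict[str, AgeCheckMethod] = {
    "facial_age_estimation": FacialAgeEstimation(),
    "id_document": IdDocumentCheck(),
}

def verify_age(chosen_method: str, user_input: dict) -> bool:
    """Run whichever method the user chose; all satisfy the same interface."""
    return METHODS[chosen_method].check(user_input)

print(verify_age("facial_age_estimation", {"estimated_age": 24}))  # True
print(verify_age("id_document", {"document_verified_age": 16}))    # False
```

Keeping each method behind the same interface means a platform can add or remove options, such as a document check for users a facial estimate cannot confidently clear, without changing the surrounding access logic.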
Yoti offers a range of age assurance options to help platforms of all types and sizes comply with the regulations. This includes our privacy-preserving facial age estimation, which lets people prove their age without using identity documents. Platforms like OnlyFans, Instagram, Yubo and Facebook Dating are already using this technology to keep minors safe and create age-appropriate experiences.
To find out how we can help you comply with age assurance for the Online Safety Bill, please get in touch.