Understanding verification requirements in the new Digital Services Act

Amba Karsondas | 9 min read

The EU’s new Digital Services Act (DSA) will apply in full from 17 February 2024. Originally only applicable to ‘very large online platforms’ (VLOPs) and ‘very large online search engines’ (VLOSEs) with over 45 million monthly active users in the EU, it will be expanded to cover all online intermediaries and platforms that offer their services to users based in the EU.

These include online marketplaces, social networks, adult content companies, content sharing platforms, app stores, and online travel and accommodation platforms. 

To comply with certain parts of the Act, platforms will need to have effective verification of business users and support age-appropriate design. This blog explains the DSA and how affected platforms can comply with its age and identity requirements.

 

What is the purpose of the Digital Services Act?

One key goal of the DSA is to create a safer online environment. It looks to do this by preventing illegal and harmful activities online, limiting the spread of disinformation and protecting the rights of users.

The new rules will ensure user safety, increase transparency and create a fair and open online environment. The wide-ranging obligations will protect consumers and their rights with clear and proportionate rules.

 

Who is impacted by the Digital Services Act?

The DSA will apply to platforms that provide people access to goods, services and content. The phrase ‘digital services’ refers to a range of online services including:

  • simple websites
  • internet infrastructure services
  • online marketplaces
  • social networks
  • content sharing platforms
  • hosting services
  • app stores
  • travel and accommodation platforms

Platforms must comply with the DSA if they operate in the EU or if there is a “substantial connection” to the EU. A substantial connection exists when the provider either:

  • has their main establishment in the EU
  • has a significant number of users in the EU
  • targets its activities towards at least one EU member state

Under these criteria, a business could have its headquarters outside the EU and still have to follow the new obligations.

Where a business has its main establishment in a member state but operates across the EU, the member state where it is established will be responsible for enforcement. Where the company is based outside the EU, it’ll have to nominate a legal representative in a member state. VLOPs and VLOSEs are supervised directly by the European Commission. As such, many UK businesses offering online services within the EU are significantly impacted.

The DSA has a tiered system of obligations. The tier that a company falls into depends on its size, the type of company and the impact that it may have on its users. Tier 1 has the least strict requirements and these get progressively more stringent for higher tiers.

 

Is this the first piece of EU legislation of its kind?

The DSA builds on and updates the EU’s Electronic Commerce Directive from 2000. Since then, technology has advanced significantly, requiring more comprehensive legislation.

The DSA refreshes the EU’s framework for illegal content on online platforms. Crucially, it also aims to harmonise the various laws that have emerged at national levels across individual EU states. This is to ensure consistency across the European single market.

The DSA will sit alongside the Digital Markets Act. Together, this package will form a single set of rules that apply across the whole of the EU and EEA. Its two main goals are:

  • to create a safer digital space in which the fundamental rights of all users of digital services are protected
  • to establish a level playing field to foster innovation, growth and competitiveness, both in the European single market and globally.

 

Why has the Digital Services Act been introduced?

People are turning to the internet to complete an increasing number of everyday tasks. We’re using it to communicate, shop, order goods, find information and stream entertainment. Ultimately, for many of us, digital services make our lives easier. 

But whilst there are many benefits of these digital services, we need to be aware of the problems they can bring. Online services could be used to:

  • trade and exchange illegal goods, services and content 
  • amplify the spread of disinformation
  • facilitate online abuse and exposure to harmful content
  • collect personal data without user consent

 

What does the Digital Services Act mean for online platforms?

Companies must meet a vast array of obligations under the DSA. These include:

  • creating user-friendly tools for users to report illegal content
  • no longer targeting advertisements at minors on online platforms
  • explaining to users why their content has been restricted or removed
  • having an internal complaints system for users to appeal content moderation decisions
  • publishing annual transparency reports on their content moderation processes
  • increasing protection of the privacy and security of children
  • informing law enforcement authorities if they become aware of any potential criminal activity
  • making sure that their platforms are not designed in a way that manipulates or deceives users
  • being able to verify the identities of business users on online marketplaces
  • ensuring clear terms and conditions on their platforms

 

How can companies protect young people in line with the Digital Services Act’s requirements?

New protections for young people mean that platforms must ensure transparent online advertising. Part of this requirement means that there will be restrictions on the ads that are displayed to minors.

Tailoring advertising to individuals relies on profiling, which uses personal data about a user. The DSA bans adverts based on profiling that uses “special categories of personal data”, such as information revealing a person’s ethnicity, religion, political opinions or health. Crucially for young people, platforms will also no longer be able to present ads based on profiling of personal data to users they know are minors.

Alongside this, if their platforms are accessible to children, companies must protect the safety of these users. They will need to implement heightened data protection measures for children. Platforms may need to adapt the language in their terms and conditions so that children can properly understand them. They may need to put parental controls into place so that parents and guardians can help protect their children from exposure to harmful content. Or they may need to create tools to allow young people to report content and receive tailored support.

The DSA also forbids “dark patterns”. These are design techniques that some platforms use to manipulate users into decisions they wouldn’t otherwise make. For instance, some sites may try to persuade users to make unnecessary purchases or make it difficult for users to cancel online subscriptions.

But to put these age-appropriate measures in place, platforms face the challenge of assessing whether the user is a child.

Establishing a user’s age is a delicate process of balancing assurance with privacy. In most cases, platforms won’t need to know the exact age of the person trying to access their content. It’s even rarer that they will need to know a user’s exact date of birth. In many cases, it’s likely they’ll only need to know if the user is over or under a particular age threshold. The less information the user needs to share with a platform, the more their privacy is protected.

Users should also be given a choice of age assurance methods. Since many young people don’t own identity documents, offering alternative methods is vital for inclusion. Alongside the common method of using an identity document, platforms could consider accepting Digital IDs and facial age estimation. These alternative methods are more privacy-preserving as users can prove just their age without sharing any other personal details or documents.
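To make the over/under idea concrete, here is a minimal, purely illustrative Python sketch. The names (`AgeAssuranceMethod`, `AgeCheckResult`, `check_age_threshold`) are hypothetical and not part of any real API; the point is that the check runs on the age assurance provider’s side and only a yes/no answer ever reaches the platform.

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum


class AgeAssuranceMethod(Enum):
    """Methods a platform might accept (names are illustrative)."""
    IDENTITY_DOCUMENT = "identity_document"
    DIGITAL_ID = "digital_id"
    FACIAL_AGE_ESTIMATION = "facial_age_estimation"


@dataclass(frozen=True)
class AgeCheckResult:
    """The only data shared with the platform: which method was used and
    whether the user met the threshold. No date of birth is passed on."""
    method: AgeAssuranceMethod
    over_threshold: bool


def check_age_threshold(date_of_birth: date,
                        threshold_years: int,
                        method: AgeAssuranceMethod) -> AgeCheckResult:
    """Runs on the age assurance provider's side: derive an over/under
    answer from a date of birth, then discard the date of birth.
    (Facial age estimation would supply an estimated age directly rather
    than a date of birth; this sketch only models the document-based paths.)"""
    today = date.today()
    age = today.year - date_of_birth.year - (
        (today.month, today.day) < (date_of_birth.month, date_of_birth.day)
    )
    return AgeCheckResult(method=method, over_threshold=age >= threshold_years)


# The platform only ever sees something like:
# AgeCheckResult(method=AgeAssuranceMethod.DIGITAL_ID, over_threshold=True)
result = check_age_threshold(date(2010, 6, 1), 18, AgeAssuranceMethod.DIGITAL_ID)
```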

 

How can companies verify their business users in line with the Digital Services Act’s requirements?

Another goal of the DSA is to counter the spread of illegal goods. The Act requires companies to be able to identify those selling goods or services on their platforms, making it easier to trace the person responsible for each sale.

Under the DSA, this ‘Know Your Business Customer’ (‘KYBC’) obligation requires online marketplaces to:

1. Collect KYBC information on sellers – Depending on the requirements, this could include their name, contact details, a copy of their identity document, bank account details, a registration number if the seller is registered in a company register or similar public register, and a self-certification by the seller committing to only offer products or services that comply with the applicable EU law.

2. Verify this KYBC information – Platforms must have effective methods in place to verify a seller’s identity. They are required to make reasonable efforts to assess whether this information is reliable.

This applies to both new sellers and those who are already active on the platform. If a seller cannot effectively prove their identity, the platform is required to suspend the user from trading. 
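As a rough sketch of that collect, verify, suspend sequence, the Python below shows how a marketplace might model it. The field and function names (`TraderRecord`, `verify_trader`, `registry_lookup`) are hypothetical and the checks are placeholders, not a definitive implementation of the DSA’s requirements.

```python
from dataclasses import dataclass
from typing import Callable, Optional


@dataclass
class TraderRecord:
    """KYBC details an online marketplace might collect from a seller.
    Field names are illustrative, not taken from the text of the DSA."""
    name: str
    contact_details: str
    identity_document_ref: Optional[str] = None   # reference to a stored document check
    bank_account: Optional[str] = None
    registration_number: Optional[str] = None     # company register entry, if any
    self_certified_compliance: bool = False       # commitment to offer only compliant goods
    verified: bool = False
    suspended: bool = False


def verify_trader(record: TraderRecord,
                  registry_lookup: Callable[[str], bool]) -> TraderRecord:
    """Make 'reasonable efforts' to assess whether the information is reliable.

    `registry_lookup` stands in for a check against a freely available official
    database, a Digital ID check, or a document verification provider.
    """
    has_required_details = bool(
        record.name and record.contact_details and record.self_certified_compliance
    )
    registration_ok = (
        record.registration_number is None
        or registry_lookup(record.registration_number)
    )
    record.verified = has_required_details and registration_ok
    # Sellers who cannot effectively prove their identity must be suspended
    # from trading until their details have been verified.
    record.suspended = not record.verified
    return record
```

In practice the lookups and document checks would sit with a verification provider or an official register; the sketch only captures the flow the DSA describes, applied to both new and existing sellers.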

Therefore, to know who is selling on online marketplaces, platforms need to have simple, frictionless methods in place so that they can effectively check identities at scale. Users should be able to verify their identity in a way that protects the security and privacy of their personal information.

One such method is a Digital ID app, which lets people share the necessary information without having to upload identity documents online. Online marketplaces can also use official online databases that are freely available.

Offering people choice in how they prove their identity ensures these checks are inclusive for people who don’t own identity documents or are unable to access them.

If the platform becomes aware of a user selling illegal goods or services, it is obliged to contact the users who have purchased the product. Thanks to the KYBC information it holds, the platform will be able to identify the seller and must offer buyers options for redress.
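As a rough illustration of that follow-up flow only, here is a short Python sketch. The data-access object, method names and redress options are all hypothetical, not a real platform API.

```python
def notify_buyers_of_illegal_listing(platform_db, listing_id: str) -> None:
    """Sketch: use the KYBC record to identify the seller, then contact
    everyone who bought the item and point them to options for redress.
    `platform_db` is a hypothetical data-access layer."""
    seller = platform_db.seller_for_listing(listing_id)   # identified via KYBC data
    for buyer in platform_db.buyers_of(listing_id):
        platform_db.send_notice(
            recipient=buyer,
            listing_id=listing_id,
            seller_name=seller.name,
            redress_options=["refund", "report the seller", "contact authorities"],
        )
```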

 

Complying with the Digital Services Act

Florian Chevoppe-Verdier, Public Policy Associate at Yoti said, “The Digital Services Act is the first major piece of online safety legislation, echoing the impact of GDPR. It is poised to set the tone for industry standards, and it is likely other advanced economies and trade blocs will follow suit. While compliance poses challenges for businesses, trusted age assurance providers such as Yoti can make the online world safer. The societal benefits will be immeasurable and truly transformative.”

If you’d like to know more about how you can comply with the EU’s new Digital Services Act, please get in touch.
