US age verification laws for online platforms

Amba Karsondas · 8 min read

From buying goods online to accessing crucial services, there are countless advantages to an increasingly digital world. But with this development comes the serious challenge of ensuring that users can safely navigate online environments.

As young people are able to access the internet more easily than ever, it’s important to make sure that their online journeys are age-appropriate. According to a national survey, the average age at which children in the US first see pornography is 12, with 15% first seeing online pornography at age 10 or younger.

In response to the evolving digital landscape, regulation is making strides to catch up with these changes. In recent months, this has dramatically sped up, with the majority of US states having enacted or introduced new age verification bills.


What is the purpose of US age verification bills?

Age verification laws have emerged in the US as a response to concerns about minors’ access to adult content online. Introducing age verification aims to protect children and create a safer online environment.

This particular type of legislation falls under state law. This means age verification bills and their requirements vary from state to state. Some states have fully enacted bills, some have introduced bills to the state’s legislative bodies and others are yet to propose bills. 

In some states, age verification requirements have been incorporated into wider children’s codes. In others, there is more specific legislation focused solely on age verification.


What do US age verification bills mean for businesses?

Across the US, several states are passing laws that require platforms to know the ages of their users. This means that users will have to prove their age before they can access websites which host certain types of content. Under this legislation, platforms need to integrate robust age assurance methods for frictionless yet secure verification.

Depending on the state, businesses may face legal consequences if they fail to comply with legislation. This includes financial penalties, civil liability and restrictions on their operations in those areas.

Among the most significant of the regulations passed are those in Louisiana and California, which have set the precedent for subsequent laws.


Louisiana Age Verification Law

The Louisiana Age Verification Law (or Act 440 of 2022) was the first law of its kind in the US. Enacted in 2023, it set the national standard for laws that protect young people from adult content. Following its enactment, several copycat laws were passed across other states including Arkansas, Florida, Kansas, Mississippi, South Dakota, Texas, Virginia and West Virginia.

The Act concerns the publication and distribution of online material which could be deemed harmful to minors. If at least one-third of a platform’s content is adult or pornographic material, the platform must establish each user’s age.


How can platforms verify age under the Louisiana Age Verification Law?

In Louisiana, age verification can be performed:

  • with a digitised identification card
  • through a system that uses government-issued identification OR relies on public or private transactional data, such as mortgage records or educational records

To protect users’ privacy, commercial entities must not keep any identifying information on the individual after performing the age verification.


California Age-Appropriate Design Code Act (CA AADC)

Alongside standalone age verification acts, some states have introduced laws with a much wider scope. The California Age-Appropriate Design Code Act (or AB 2273) includes age assurance requirements alongside other measures designed to protect children’s privacy.

The Act was signed into law in 2022 and is due to take effect from July 2024. It extends beyond the reach of the federal Children’s Online Privacy Protection Act (COPPA).

The CA AADC concerns online platforms that are likely to be accessed by children. These sites must have their privacy settings set to the highest level by default. It also requires them to conduct a Data Protection Impact Assessment (DPIA).

A DPIA details how the business will use children’s personal data and assesses any associated risks. Platforms will need to review and update their DPIA every two years, and the assessment must be made available to the California Attorney General upon request.

To create age-appropriate experiences, platforms must have a secure way to determine the age of their users. The CA AADC obligates platforms to take a risk-based approach to age assurance. Businesses must assess whether the risks of processing a larger amount of data are proportionate to the level of certainty they need about a user’s age. In other words, platforms should only process more data about a user when greater precision about their age is genuinely necessary.


How can platforms verify age under the CA AADC?

The CA AADC allows for three main approaches to establishing age. Platforms can use them individually or in combination, depending on whether further age checks are needed. These are:

  • Self-declaration: this asks users to state their age or date of birth. Though it is quick to do, people can easily give a false answer. As such, the CA AADC only allows this method in cases that are very low risk.
  • Age verification: this is any method that can determine the exact age of a user. Verification can be performed against a document such as a passport, or against third-party databases such as through a credit card check.
  • Age estimation: this method uses technology to accurately estimate a person’s age. The technology detects a live human face, analyses the pixels in the image and gives an age estimate. Effective facial age estimation doesn’t require the user to have any documents and does not identify the individual.
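As a purely illustrative sketch of this risk-based principle, the choice between the three approaches above might be modelled as follows. The risk tiers, function name and mapping are hypothetical assumptions for illustration only; they are not taken from the text of the CA AADC.

```python
# Hypothetical sketch: selecting the least data-intensive age assurance
# method proportionate to risk. Risk tiers and the mapping below are
# illustrative assumptions, not requirements drawn from the CA AADC.

from enum import Enum


class Risk(Enum):
    LOW = 1
    MEDIUM = 2
    HIGH = 3


def choose_age_check(risk: Risk) -> str:
    """Return the least data-intensive method proportionate to the risk."""
    if risk is Risk.LOW:
        # Very low risk: self-declaration may be acceptable.
        return "self-declaration"
    if risk is Risk.MEDIUM:
        # Moderate risk: estimate age without collecting identity documents.
        return "age estimation"
    # High risk: verify exact age against a document or third-party database.
    return "age verification"
```

The idea is simply that the amount of personal data processed scales with the risk of the content, so `choose_age_check(Risk.HIGH)` falls through to full age verification while lower tiers avoid document collection.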


How can businesses comply with US age verification laws?

Platforms should establish who is accessing their content and which laws apply to them, both at a federal and state level. It is important to note that even if a business is not based within a particular jurisdiction, it may still be obligated to follow its legislation.

Until recently, users could meet age verification requirements by self-declaring their age. In many cases, this was as simple as ticking a box marked ‘over 18’. However, under new age verification laws, platforms need to conduct more robust age assurance checks. These methods should be accurate, secure and accessible.

There are three main ways that we can help platforms to verify the ages of their users. The most well-known of these is age verification using an identity document. Alongside this, platforms can use Digital IDs or facial age estimation, where accepted under the relevant legislation. These alternative methods are privacy-preserving: the user can prove their age without sharing any other personal details.

As of January 2024, nearly 2.6 million adult US citizens had no government-issued photo identification, and 21 million voting-age US citizens did not have a valid driving licence. To ensure that their services are available to as many people as possible, platforms should offer a range of age assurance methods to their users.


An evolving regulatory landscape

The rate at which age verification laws are changing in the US is unprecedented. Multiple bills are currently going through various states’ legislative bodies. As regulation attempts to keep up with the speed of technological advancement, businesses should seek regular legal advice to assess whether they are compliant with local laws.

After establishing which age assurance methods are accepted, platforms should integrate a choice of age checks for their users. When done effectively, robust and accurate age verification can play a pivotal role in building a safer internet for young people. 

If you’d like to know more about how your business can comply with US age verification laws, please get in touch.


