Understanding the Kids Online Safety Act (KOSA)

Rachael Trotman · 6 min read

From the UK’s Online Safety Act to Europe’s Digital Services Act, we’re in an era of increasing online safety regulation. In the US, the Kids Online Safety Act (KOSA) is a significant piece of legislation, currently making its way through Congress. 

This blog looks at some of the requirements of KOSA and what this would mean for companies.

What is the purpose of KOSA?

First introduced in February 2022 by Senators Richard Blumenthal (D-CT) and Marsha Blackburn (R-TN), KOSA aims to protect children from harm online. It would require platforms to limit addictive features, allow young people to opt out of algorithmic recommendations, and restrict access to children’s personal data. 

President Biden has been calling for legislation to improve online safety for young people for the past three years. He first spoke about online harms to young people in his State of the Union address in 2022, pushing for new privacy protections for children online. The president has continued to champion the issue and spoke about it again in his 2024 State of the Union address, recognising that 'there's more to do, including passing bipartisan privacy legislation to protect our children online'.

Who does KOSA apply to?

The bill would set a duty of care to protect 'minors' (defined as children under 17) on certain online platforms, including social media, online video games, virtual reality worlds, online messaging and video streaming sites. The bill, however, exempts some entities and services, such as email providers, internet service providers and educational institutions. Regulated or 'covered' platforms would need to limit addictive or harmful features, implement the highest privacy settings by default and develop more robust parental controls.

The Federal Trade Commission (FTC) would be responsible for enforcing the requirements of the bill. Companies that do not comply, or that fail to prevent or reduce harm online, would face significant lawsuits. State attorneys general would be able to enforce certain parts of the law, including its provisions on safeguards for minors, disclosure and transparency.

Has KOSA been signed into law?

KOSA was updated and passed unanimously out of committee in July 2023. It has bipartisan support, with 62 senators now backing the bill, and is currently going through Congress. 

A number of civil liberties advocates and opponents of the bill have raised concerns, primarily around privacy, the censoring of free speech and limiting access to essential information, including LGBTQ+ related topics. But the bill's supporters say it is not designed to target specific types of content, but rather the design of the algorithms that recommend content.

An update to the bill in February 2024 aimed to mitigate the concerns around free speech and access to information. The updated wording includes a specific definition of a "design feature": anything that encourages minors to spend more time and attention on a platform, such as infinite scrolling, notifications and rewards for staying online. By focusing on these types of features, the bill targets the design of platforms rather than the content they host.

What are the current requirements of the bill?

The central focus of KOSA is to introduce a duty of care to prevent and mitigate risks faced by users under 17. Platforms will need to reduce potentially harmful content, including content relating to self-harm, anxiety, depression and eating disorders. Platforms must activate the highest levels of privacy and safety settings by default for users under 17, and limit certain design features such as infinite scrolling and rewards for staying online.

Platforms would also have to implement the most restrictive privacy and safety settings by default. This has some similarities with the UK's Age Appropriate Design Code, and would restrict platforms from collecting minors' personal data. Regulated companies would also need to make it easy for users to limit who can contact them and who can see their information, as well as allowing parents or guardians access to children's privacy and account settings.

The current wording of the bill states that platforms must also:

  • disclose specified information, including details regarding the use of personalised recommendation systems and individual-specific advertising to minors
  • allow parents, guardians, minors and schools to report certain harms
  • refrain from facilitating advertising of age-restricted products or services, such as tobacco and gambling, to minors

Platforms with more than 10 million monthly active users in the US would also need to report annually on foreseeable risks of harm to minors and on the mitigation steps the platform has introduced.

Are there any age verification requirements of KOSA?

The age requirements in the bill have been updated since it was first introduced. KOSA does not require platforms to implement age verification. Instead, they can rely on a standard of 'objective circumstance'. This is based on the assumption that social media platforms already hold enough empirical data about a user to make a reasonable estimate of their age. If a platform can reasonably determine that a child is using its service, it must take steps to proactively protect them.

Platforms are likely to interpret this wording differently, but FTC guidance on the point would be expected. Considering the risk of wrongly estimating a user's age, some companies will still choose to introduce new, additional measures and technology to verify the age or age range of users if, for example, they host 18+ content on their platform.

The bill also includes parental consent measures. Platforms would need to obtain 'verifiable parental consent' before minors can create social media accounts, alongside safeguards and parental control settings.

Whilst the age requirements in KOSA have been updated, it is worth noting that it is possible to complete data-minimised age checks online without risking privacy. There is no need to ask users to upload copies of identity documents or share lots of personal information. It is also possible for parents to grant parental consent using FTC-approved approaches.

What’s next for KOSA?

Despite unanimous passage out of committee and over 60 supporters in the Senate, there is still a way to go until KOSA becomes law. It still needs a companion piece of legislation introduced into the House of Representatives, something which has not yet happened. Despite Biden's call in the 2024 State of the Union address, recognising that 'there's more to do, including passing bipartisan privacy legislation to protect our children online', there is also the risk that the bill stalls entirely if it has not been passed by the time of the November 2024 Congressional elections.

With other countries implementing their own online safety regulations, such as the UK’s Online Safety Act and Europe’s Digital Services Act, we will be watching closely to see how KOSA progresses.

We will continue to update this blog as KOSA evolves and progresses through Congress. Keep up to date with the latest news by following us on LinkedIn.