Asking the FTC to approve facial age estimation for verifiable parental consent

Yoti · 3 min read
[Image: Father and son using devices together]

Together with the Privacy Certified program of the Entertainment Software Rating Board (ESRB) and Kids Web Services Ltd, a wholly owned Epic Games subsidiary, we are asking the Federal Trade Commission (FTC) for approval to implement facial age estimation as an authorised method for verifiable parental consent (VPC). The Children’s Online Privacy Protection Act (COPPA) requires companies to ensure they are not collecting personal data from children under the age of 13 without a parent’s consent. Currently, the COPPA Rule enumerates seven non-exhaustive methods that parents can use to verify their consent. These include verification by government ID, credit card transaction, and a method that involves facial recognition, which is different from what we propose in our application.

We’d like the FTC to authorise facial age estimation as another VPC method, giving parents and operators more choice in how age is proven and consent granted. We are not seeking approval to use this technology to check whether children are old enough to purchase, download, or play a video game. Children are not involved in the age estimation process at all; it is designed to confirm that parents are adults, as required under COPPA.

We’d like to explain how the FTC application seeks to implement the technology in a way that is consistent with COPPA’s requirements for data minimisation, confidentiality and security.

  • Facial age estimation provides an accurate, reliable, accessible, fast, simple and privacy-preserving method to ensure that the person providing consent is an adult.
  • Facial age estimation is not facial recognition; it estimates a parent’s age without identifying them. It does so by converting the pixels of a facial image into numbers and comparing the pattern of numbers to patterns associated with known ages.
  • Facial age estimation does not create a database of faces. It doesn’t learn the user’s name, identity or anything about them. It does not scan their face against a database.
  • The technology is inclusive; it requires no collection of identity or payment card information.
  • It is accurate and does not show material bias among people of different skin tones. We have done extensive testing based on millions of facial scans and publish the accuracy levels transparently.
  • In our proposed application, facial age estimation is always presented as an option to parents alongside other approved methods of verification, providing the parent with a choice of methods.
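The pattern-comparison idea described above can be illustrated with a minimal sketch. This is not Yoti's actual model: real systems derive the numeric pattern with a trained neural network, whereas here a plain pixel flattening stands in for the feature extractor, and a nearest-neighbour lookup stands in for the learned age predictor. All names and data below are illustrative assumptions.

```python
import numpy as np

def image_to_features(pixels: np.ndarray) -> np.ndarray:
    """Convert an image's pixels into a normalised pattern of numbers.
    (A stand-in for a trained feature extractor; no identity is stored.)"""
    v = pixels.astype(float).ravel()
    norm = np.linalg.norm(v)
    return v / norm if norm else v

def estimate_age(pixels: np.ndarray,
                 reference_features: np.ndarray,
                 reference_ages: np.ndarray) -> int:
    """Compare the image's pattern against patterns associated with
    known ages and return the age of the closest match."""
    features = image_to_features(pixels)
    similarities = reference_features @ features  # cosine similarity
    return int(reference_ages[int(np.argmax(similarities))])

# Illustrative references: two tiny "images" with known ages.
ref_a = image_to_features(np.ones((2, 2)))           # pattern seen at age 40
ref_b = image_to_features(np.eye(2))                 # pattern seen at age 10
references = np.stack([ref_a, ref_b])
ages = np.array([40, 10])

print(estimate_age(np.ones((2, 2)), references, ages))  # → 40
```

Note that nothing in this flow identifies the person: only the numeric pattern is compared, and the only output is an age, which mirrors the distinction drawn above between age estimation and facial recognition.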

If you have any questions about our FTC application, please get in touch.
