Age assurance
How Yoti’s facial age estimation is used across different industries
Checking users’ ages has never been more critical for businesses catering to diverse audiences. However, they face the challenge of verifying their users’ ages effectively whilst maintaining a seamless, user-friendly experience. Yoti’s facial age estimation is a secure, privacy-preserving way to do just that. Our technology is used across a variety of industries, both online and in person, including retail, social media, dating, gaming, gambling and financial services. In this blog, we explore how businesses are using facial age estimation to create safer, more positive experiences for their users.
Helping Instagram to create safer online experiences with new Teen Accounts
From today, Meta is introducing new ‘Teen Accounts’ on Instagram for users under the age of 18. This change aims to help parents keep their teens safe online through features with built-in protections. These include the ability to set daily usage limits, restrict access during certain hours and monitor their child’s interactions, such as the accounts they are messaging and the types of content they’re engaging with on the platform. New users under the age of 18 are given the strictest privacy settings by default.
Why Yoti’s facial age estimation is not facial recognition
There’s quite a bit of confusion about the differences between facial age estimation and facial recognition. While both technologies work with images of faces, they’re used for different purposes and trained in different ways. To help clear up some of these misconceptions, we’ve explained some of the key ways in which our facial age estimation is not facial recognition. Facial age estimation and facial recognition are designed to give two different outcomes: facial age estimation delivers an estimated age result, while facial recognition delivers a match (or no match) between images of a person.
Facial Age Estimation white paper
Making it faster and safer to prove your age. Our age estimation technology accurately estimates a person’s age by looking at their face. We built it to give everyone a secure and private way of proving how old they are in everyday scenarios: from age checks on social platforms and online stores, to supermarket self-checkouts, bars and clubs. This privacy-friendly approach to age verification doesn’t require any personal details or documents, and all information is instantly deleted once someone receives their estimated age – nothing is ever viewed by a human.
Why do Yoti facial age estimation results published by NIST differ from those reported by Yoti in its white papers?
In September 2023, we submitted our facial age estimation model to the US National Institute of Standards and Technology (NIST) as part of a public testing process. This is the first time since 2014 that NIST has evaluated facial age estimation algorithms, and its age estimation reports are likely to become a globally trusted performance guide for vendor models. NIST assessed vendor facial age estimation models using four test datasets at specific image sizes, noting in its report that age estimation accuracy “will depend on
The importance of transparency for facial age estimation
To protect young people online, businesses need to provide age-appropriate experiences for their users. This applies to online marketplaces, social media networks, content sharing platforms and gaming sites. But to put the correct measures in place, businesses need to know the ages of their customers. It was previously thought that the only way to confidently establish a user’s age was with an identity document, like a passport or driving licence, or with checks against third-party databases such as credit reference agencies or mobile network operators. However, regulators are now recognising facial age estimation as an effective alternative.