Regulation
Online Safety Act becomes law
After years of debate and discussion, the Online Safety Act is now law, marking a new chapter in online safety. There are three key elements within the Online Safety Act that we are ready to help with:
- Age assurance to help platforms create safe, age-appropriate experiences online
- User verification to give users more control over who they interact with online
- Over-18 consent from content creators for the publication of intimate images

Age assurance in the Online Safety Act
The Online Safety Act is not about excluding children from the internet. It’s about giving them an experience…
Understanding age assurance in the Online Safety Act
The Online Safety Act covers a wide range of issues, including minimising the risk of children seeing harmful and age-inappropriate content, removing illegal content like child sexual abuse material (CSAM), criminalising fraudulent and scam ads, and introducing age verification for certain online services. This blog looks at some of the age requirements in the Act and what they mean for tech companies, adult sites, gaming companies, social media platforms and dating sites.

What is the purpose of the Online Safety Act?
According to the UK government, the Online Safety Act aims to make the UK ‘the safest place in…
Australia’s new National Self-Exclusion Register
From 21st August 2023, Australians will be able to ban themselves from all online wagering companies. “BetStop”, the National Self-Exclusion Register (NSER), will let people exclude themselves from all licensed online wagering operators, for a minimum of 3 months and up to a lifetime. The move aims to protect vulnerable people and reduce problem wagering. The BetStop NSER will be managed by the Australian Communications and Media Authority (ACMA). ACMA released a report which found that 11% of Australians had participated in online wagering in the past six months – up from 8% in 2020. What does…
Asking the FTC to approve facial age estimation for verifiable parental consent
Together with the Privacy Certified program of the Entertainment Software Rating Board (ESRB) and SuperAwesome, we are asking the Federal Trade Commission (FTC) for approval to implement facial age estimation as an authorised method for verifiable parental consent (VPC). The Children’s Online Privacy Protection Act (COPPA) requires companies to ensure they are not collecting personal data from children under the age of 13 without a parent’s consent. Currently, the COPPA Rule enumerates seven non-exhaustive methods that enable parents to verify their consent. These include verification by government ID, credit card transaction, and a method that involves facial recognition, which is…
UK games industry publishes new guidelines for Loot Boxes
Loot Boxes, found in certain video games, give players the opportunity to receive random items. They can be purchased with real or virtual money, or earned through gameplay. Loot Boxes are a lucky dip; the player doesn’t know what item they will receive. They might unlock new levels or give players access to special characters, equipment and weapons. Loot Boxes can add an element of excitement to the game, but concerns have been raised that the very nature of receiving a surprise item can be addictive, and that Loot Boxes could encourage, and be a pathway to, problem gambling. Following these…
The Age Appropriate Design Code for businesses
The Age Appropriate Design Code (also known as the “Children’s Code”) is the first statutory code of practice for children’s data in the world. Introduced by the UK Information Commissioner’s Office (ICO) in 2021, the set of standards seeks to ensure that online services are designed in the best interests of the child. “The best interests of the child” is a concept from the United Nations Convention on the Rights of the Child (UNCRC), which recognises that children need special safeguards and care in all aspects of their lives. In a world first, the Code extends this protection to…