Being able to access the online world is a right for all, but it is not right that all who are online can see everything – not all at once anyway. A childhood, for example, is supposed to be a journey of discovery, where each day brings some new titbit of information much like an advent calendar releases treats in the run up to Christmas. Sadly, that concept of ‘childhood’ is under threat from the current state of online regulation which threatens to turn it from an advent calendar into a situation more akin to throwing a toddler into a swimming pool-sized box of (unwrapped) Quality Street.
In just a few clicks and with few (if any) checkpoints, young children can see graphic pornographic material, advice that encourages eating disorders and self-harm, excessive violence and race-hate material. This is particularly true for children who own a smartphone.
Some of the inappropriate content listed above is illegal, and is pursued by the relevant authorities. Other material is entirely legal but, unfortunately, not well enough regulated to guarantee that vulnerable, unprepared eyes don't see it.
(Preparation is a key point here: we must teach our children to become critical thinkers online – to be aware of the different dangers and to know how to react should they encounter them. Both parents and schools should become leaders in this space.)
But education and preparation can only take us so far. Take pornographic material, for example:
The UK government is aware of, and unimpressed by, these numbers – led by calls from David Cameron, it has pledged to ban pornographic websites if they cannot verify the age of their visitors.
Well-meaning rhetoric, but what are the practical implications of this ruling for business owners and consumers? If a resource-rich giant like Facebook can't keep out under-age users, how can smaller companies do it? And what will the ruling mean for websites operated from overseas?
One option is to place the responsibility with the ISPs. Another is to require credit card details at the point of entry. But it is fairly clear that a system requiring the manual reporting, blacklisting and whitelisting of sites is unsustainable, and relying on third-party checks (e.g. through financial institutions) could prove costly for website owners, not to mention a much worse user experience for website visitors.
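To make the trade-off concrete, here is a minimal sketch of how a third-party age check might work without the site ever seeing a credit card number or date of birth: the provider signs a single yes/no attribute, and the website only verifies the signature. All names here (the shared secret, the `is_over_18` attribute, the function names) are hypothetical illustrations, not Yoti's actual API or any real provider's protocol.

```python
import hashlib
import hmac
import json

# Hypothetical key shared between the website and the age-check provider.
SHARED_SECRET = b"demo-secret"

def sign_attribute(payload: dict) -> str:
    """Simulate the provider signing an age attribute (illustrative only)."""
    body = json.dumps(payload, sort_keys=True).encode()  # canonical form
    return hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()

def is_visitor_over_18(payload: dict, signature: str) -> bool:
    """Website-side check: trust the attribute only if the signature verifies."""
    expected = sign_attribute(payload)
    if not hmac.compare_digest(expected, signature):
        return False  # tampered with, or not signed by the provider
    return bool(payload.get("is_over_18", False))

# A signed claim from the (hypothetical) provider:
claim = {"is_over_18": True, "issued_at": 1700000000}
token = sign_attribute(claim)
print(is_visitor_over_18(claim, token))                  # genuine claim: True
print(is_visitor_over_18({"is_over_18": True}, token))   # altered claim: False
```

The point of the sketch is the data flow, not the crypto: the site learns one bit ("over 18"), which is friendlier to both the visitor's privacy and the site owner's costs than handling payment details or identity documents directly.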
Without a viable solution, it is not impossible to see a blanket ban on websites featuring adult sexual content coming into force – other countries have gone this way. But that whiffs of failure rather than solution (and would probably end up on the losing side of a larger debate on net censorship). A real solution would have to take into account modern consumer behaviour (mobile, multi-platform) and the consumer's right to privacy, as well as website owners' need for a cost-effective, user-friendly process. To learn more about the work that Yoti is doing in this space, check out our online age checking system.
By Alex Harvey
Ask me anything: @alextharv