Why are child safety groups asking for age verification on porn sites?
They fear that it is too easy for children to access publicly available pornography online. Experts who work with children say pornography gives children an unhealthy view of sex and consent, putting them at risk from predators and possibly preventing them from reporting abuse.
It can also cause children to behave in ways that are risky or inappropriate for their age, harming themselves and others. Charities say children tell them pornography is hard to avoid and can leave them ashamed and distressed. One of the concerns is the extreme nature of pornography on mainstream sites: one study found that one in eight videos seen by first-time visitors featured violent or coercive content.
In an investigation by the British Board of Film Classification last year, 60% of children aged 11 to 13 who said they had seen pornography said the viewing was largely unintentional. Ofcom research found that commercial pornography site Pornhub – which does not use age verification – had a larger UK audience than BBC News: it was visited by 50% of all men and 16% of all women in the UK in September 2020. Three of the most popular sites in the UK – Pornhub, RedTube and YouPorn – are owned by a Canadian company, MindGeek.
Last December, Mastercard and Visa announced that they would block customers from using their credit cards on Pornhub following accusations that the site showed videos of child abuse and rape. A New York Times investigation alleged that the site hosted revenge pornography filmed without the consent of participants. Following the accusations, Pornhub’s owner, MindGeek, removed millions of videos uploaded by unverified users. Pornhub has firmly denied all the accusations, although it has taken far-reaching measures to “protect” the site.
How are “age assurance” and “age verification” different?
Age assurance describes the methods used by businesses to determine the age of an online user, such as self-declaration (i.e. a pop-up form); profiling (estimating a user’s age from the content they consume or the way they interact with a device); and biometric methods such as facial analysis.
Age verification uses a piece of identification known as a “fixed identifier”, such as a passport or credit card. The NSPCC, the child protection charity, wants this to be mandatory for accessing high-risk sites, such as commercial pornography or dating sites.
What regulations exist on age assurance and verification?
The UK’s data watchdog, the Information Commissioner’s Office (ICO), introduced the Children’s Code (or age-appropriate design code) in September. It is designed to prevent websites and apps from misusing children’s data – for example, by using “nudge” techniques to make children spend more time online, or by creating profiles of them that are then used by algorithms (instructions for computers) to direct children towards dangerous content.
If a website or app recognises that its content may be risky to children, it needs to manage that risk, and one method is to use age assurance. John Carr, from the Children’s Charities’ Coalition on Internet Safety, argues that pornographic sites should require age verification because they are “likely to be viewed by children”. The ICO’s position is that pornographic sites are not intended for children and therefore do not fall under a code designed to make internet services safer for children.
Ofcom regulates video-sharing sites that have their European headquarters in the UK, such as TikTok, Vimeo and Snapchat (YouTube, for example, is not based in the UK). These sites are legally required to protect under-18s from harmful video content, and Ofcom guidance in October said that platforms hosting pornography “should have strong age verification”. Online gambling sites, which are prohibited to under-18s, are also required to have users “affirm that they are of legal age to play”.
What does the Online Safety Bill propose regarding age verification?
The Online Safety Bill (OSB) emphasises protecting children from harm online, whether through exposure to online pornography or viewing other harmful content. It applies to companies that host “user-generated content” – such as Facebook, Twitter and YouTube – but also to commercial pornography sites.
The bill imposes a duty of care on technology companies: to prevent the proliferation of illegal content, such as images of child sexual abuse, and to ensure that children are not exposed to harmful or inappropriate content. Age assurance or age verification is an obvious way to meet that duty.
But the bill does not require age verification for sites that could expose children to harmful content. Instead, the regulator, Ofcom, may recommend that certain sites, such as pornography sites, use age assurance or verification. Businesses could be required to provide a risk assessment, including whether they expose children to harmful content, and to suggest ways of mitigating those risks. Ofcom will then determine whether the company has put appropriate measures in place to protect children or whether it is failing in its duty of care. If it is failing, Ofcom can order the use of age assurance and verification measures.
Is there any alternative legislation pushing for age verification?
Yes. Crossbench peer Beeban Kidron, architect of the ICO’s children’s code, has introduced a private member’s bill in the House of Lords: the Age Assurance (Minimum Standards) Bill. It sets out a framework for basic online age-assurance standards. Its provisions could end up in the OSB if, for example, Ofcom – which will enforce the legislation – is empowered to introduce such standards.