
The UK Online Safety Bill

June 17, 2021

AVPA Co-Chair, Alastair Graham, CEO and Founder of AgeChecked, reviews the latest piece of legislation being presented to the UK Parliament…

Last month, the UK published its long-awaited Online Safety Bill. The Bill creates a new duty of care on websites and search engines which allow user-generated content to be shared, and essentially targets social media. Debate on the draft Bill has focused both on what is included and on what has been excluded. While some concerns have been raised about the limits to the scope of this legislation, its impact on a very large number of websites, apps and games globally should not be underestimated.

Search engines do not require much definition, so it is the other category, termed a “user-to-user service”, which merits more explanation. Such services are defined as those which allow content (including written material or messages, oral communications, photographs, videos, visual images, music and data of any description) that is generated by a user, or uploaded to or shared on the service by a user, to be read, viewed, heard or otherwise experienced by another user.

The new law applies to services wherever they are located around the world, so long as they have a significant number of UK users or the UK is a target market, or, even where neither of these criteria applies, the service poses a material risk of significant harm to people in the UK.

At this point, it is worth mentioning that there is an exemption for websites with limited functionality which only publish comments or reviews about material the site itself has generated. We understand this to be designed to allow websites which sell goods to invite their customers to submit reviews, click like or dislike, or add an emoji to share their thoughts with other customers. As written, this exemption could be applied rather more widely, so it will be interesting to see how the legislation is amended as it progresses through Parliament when other examples are considered, e.g. a pornographic site that produces and publishes all of its own videos.

For all the sites which are caught by this legislation, there is a requirement to comply with several new, specific duties. These address the risk of illegal content, the protection of freedom of expression and privacy, allowing users to report issues and seek redress, and keeping records and reviewing ongoing compliance. So, for example, services must assess the risk that they might carry terrorist or child sexual exploitation and abuse content. They must also allow users to alert them to illegal or harmful content, to complain if they feel their freedom of expression or privacy is being unfairly inhibited, or to object if they believe they are being negatively impacted by the protective measures being applied.

There is a counterbalancing duty to have regard to the importance of protecting freedom of expression within the law, and another which warns services against unwarranted infringements of privacy as a result of the policies and procedures they may introduce in order to comply with the other duties. To some extent, this may be to alleviate concerns raised under previous attempts to address online harms, such as the Digital Economy Act 2017, where less attention was paid within the statute to protecting privacy; this is welcome.

There are additional duties if a site or search engine is likely to be accessed by children. This is the same test used by the Information Commissioner’s Office when applying the Age Appropriate Design Code, so it is a welcome step towards consistency across the regulatory ecosystem. The judgement includes both the likelihood of children accessing a site and the actual number of children who do access it.

For these sites, the operators will need to consider whether there is a risk that children will be exposed to content defined as harmful by the Secretary of State, with the approval of Parliament. The minister will define two categories, ‘primary priority content’ and ‘priority content’, although content which is not listed by the minister should also be considered if it may pose a threat to the well-being of children.

Each of the duties mentioned so far is considered in the context of the degree of protection already built into a service, and the response by the operator need only be proportionate to the level of risk.

For the largest services, which will be designated “Category 1”, there is an additional requirement to conduct an adult risk assessment and to protect both democracy and journalism. Notably, there is another exemption within the Bill to allow for the sharing of news through search engines, provided this comes from a recognised news publisher and is reproduced in full.

Once the Bill is passed into law, it will not take effect until Ofcom, as the regulator, has prepared a code of practice for any given duty. Ofcom is required to consult very broadly with stakeholders, including experts in public health and technology, as well as those who have suffered harm online. Each code must be sufficiently detailed that providers understand what is expected of them in practice. The steps outlined must be proportionate and feasible, given the size of the operator. The Bill also sets out ‘online safety objectives’ for the services within scope that will guide the creation of the codes of practice. Notably, these include providing a higher standard of protection for children than for adults, and taking into account the different needs of children at different ages. Only once a code of practice has been approved by Parliament will it come into force 21 days later, and only then can Ofcom enforce the duty to which it refers. There is a concern that this process could take months or even years.

We described above the global reach of this new legislation, which therefore requires the ability to enforce against non-compliant services wherever they are based. This is achieved through service and access ‘restriction orders’, which Ofcom can obtain from a court if a service has refused to comply with its notices. There is also the power to impose a fine of up to £18 million or 10% of worldwide revenue, whichever is the larger.
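
To make the penalty cap concrete, here is a minimal sketch of how the maximum fine would be calculated; the revenue figures below are purely hypothetical illustrations, not drawn from the Bill.

```python
# Minimal sketch: the maximum fine is the greater of £18 million
# and 10% of worldwide revenue (revenue figures are hypothetical).

def maximum_fine(worldwide_revenue_gbp: float) -> float:
    """Return the maximum fine permitted under the draft Bill."""
    return max(18_000_000.0, 0.10 * worldwide_revenue_gbp)

# Hypothetical smaller service: 10% of £50m is £5m, so the £18m floor applies.
print(maximum_fine(50_000_000))     # 18000000.0

# Hypothetical large platform: 10% of £5bn is £500m, well above the floor.
print(maximum_fine(5_000_000_000))  # 500000000.0
```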

To inhibit access to a service, Ofcom will be given powers to compel action by any facility that is able to withdraw, adapt or manipulate its operations to impede access to an offending service.  This includes internet access services and, innovatively, app stores.

It is worth pointing out that the government promised that this Bill would deliver the same objectives as Part 3 of the Digital Economy Act, which, in 2018, Ministers decided against implementing. Part 3 famously introduced age verification for pornographic websites. The substantive text of the new Bill only includes the word ‘pornography’ once, and that is for the purpose of repealing Part 3, so it is not obvious how this will be achieved unless the adult site in question continues to support user-generated content. This significant loophole did not go unnoticed, and, speaking just one day after the Bill’s publication, the Secretary of State said he was open to accepting an amendment to extend the protections offered to children in this regard while the Bill goes through a process of pre-legislative scrutiny, which is due to start in the next few days.

For age verification providers, this legislation is going to increase demand for our services, because it is simply not possible to apply an extra level of protection for children online without knowing which users are under 18. Indeed, in aligning itself with the ICO Children’s Code, there will be an expectation that many websites also know the age ranges of children on their site, so that the youngest children can be offered the highest level of protection. AV providers already offer highly effective, independent, standards-based age verification at 18, and are developing proportionate solutions to facilitate more granular protection of children of different ages in time to support both these key pieces of legislation. We are working with government and regulators to make the internet a safer space for children, preserving the freedoms of adults while protecting the privacy and data of users of any age.

Privacy: a foundational concept for age verification

Perhaps the most frequent concern raised about age verification is a risk to privacy. But the essence of age assurance is the ability to prove your age online WITHOUT disclosing your identity. Our industry would not exist were there not the absolute need to preserve...