On 12 May 2021, HM Government published Command Paper 405, a draft of the Online Safety Bill to allow a joint committee of members from both Houses of Parliament to undertake pre-legislative scrutiny. This committee is expected to complete its work by the summer recess, allowing a revised Bill to be introduced for formal consideration by Parliament in the autumn, with Royal Assent targeted before the end of the current session (spring 2022).
The Bill introduces a ‘duty of care’ which itself comprises a number of specific duties on organisations which offer ‘user-to-user services’ online. These services are defined as allowing one user to read, view, hear or otherwise experience (‘encounter’) content (including written material or messages, oral communications, photographs, videos, visual images, music and data of any description) from another user.
Ministers have long claimed that the Bill will deliver the objectives of Part 3 of the Digital Economy Act 2017, notably the requirement for age verification to access online pornography. On the face of the draft text, it is not immediately clear how this will be achieved, given that the term ‘pornography’ appears just once in the substance of the Bill, and then only in order to repeal Part 3.
Where an adult website includes functionality that facilitates user-to-user services, it will be captured by the duty of care and is very likely to be required by the new regulator, Ofcom, to apply age verification. But the initial draft of the Bill would not bring into scope commercial pornographic websites which either do not currently offer user-to-user services or may choose to remove this functionality in order to escape such a requirement.
Speaking to the DCMS Select Committee the day after the Bill’s publication, the Secretary of State made clear he was concerned that the Bill might become ‘a Christmas tree Bill’, with a large number of additional provisions being added to it like baubles. However, when challenged about concerns that pornography is associated with domestic violence towards women and girls, he volunteered that the Government is open to Parliament suggesting an amendment through the pre-legislative scrutiny process. He said he is “happy to find a commensurate way of providing wider protection for children… that is one bauble I might be open to hanging on the Christmas tree” as “there is a strong case for” extending the Online Safety Bill to include commercial pornographic websites.
It would appear relatively straightforward to add, alongside services which offer user-to-user interaction and search engines, a third category of commercial pornographic websites, drawing on the definition in the Digital Economy Act so that these sites are also captured by the various duties defined in the Bill. It would also be wise to amend the enforcement mechanisms, which currently rely on an application being made by Ofcom to the court and are therefore wholly inadequate for dealing with over 1,000,000 adult websites. It is also worth reviewing the Digital Economy Act for other measures not yet replicated in the Online Safety Bill but which merit being reinstated – for example, the power for the regulator to block extreme pornography.
The new Bill already includes a duty to protect the privacy of users, and not to put this at risk through any measures taken to comply with the safety duties. While the BBFC put in place a voluntary certification scheme to protect privacy and ensure the highest standards of data security, the DEA itself contained no such statutory duty; it is very welcome that the new Bill does not repeat that omission. A well-regulated, independent, standards-based, open and competitive market for age verification will allow websites, apps and platforms to know the age (but not the identity) of their users, to a level of certainty proportionate to the risk of harm their content presents.