Response to the Free Speech Coalition submission to Ofcom

March 19, 2024

We were pleased to see the Free Speech Coalition's response to Ofcom's consultation on Part 5 of the Online Safety Act, as it gives us the opportunity to clarify some potential misunderstandings and to offer reassurance, particularly to the many smaller adult websites that may fear compliance will be far too costly and will lose them a substantial part of their audience.  Our message to them is that age assurance technology is now cheap, convenient and effective.

Question 1: Do you agree with our proposed guidance on scope?

We can start in agreement with the FSC: the scope of Part 5 is confusing.  In our own response, we highlighted the ambiguity of "tube sites", which could as easily be treated as user-to-user platforms as provider-published sites.

Question 4: Do you agree that service providers should use the proposed criteria to determine whether the age assurance they implement is highly effective?

We further agree with the FSC that Ofcom, not service providers, should determine what constitutes "highly effective age assurance."  We agree with the FSC that "Ofcom needs to assess the options and provide a standard, as well as a list of methods that meet that standard".  In our own response, we outlined a simple phrasing that specifies the tolerable percentage of false positives overall (5%) and the tolerable percentage of false positives where the user was more than two years too young (1%).
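To illustrate, the phrasing we proposed amounts to a simple pass/fail test over a set of trial results. The sketch below is hypothetical: the 5% and 1% thresholds come from our response, but the function shape, its inputs, and the use of 18 as the threshold age are our own illustrative assumptions, not part of any official standard.

```python
# Hypothetical sketch of the tolerance test described above. The 5% and
# 1% thresholds come from the text; the function, its inputs, and the
# threshold age of 18 are illustrative assumptions.

def meets_standard(results, threshold_age=18):
    """results: list of (true_age, passed_check) pairs from testing an
    age assurance method against users of known age."""
    # Underage users who were wrongly passed are false positives.
    underage = [(age, passed) for age, passed in results
                if age < threshold_age]
    if not underage:
        return True  # nothing to measure against
    fp_rate = sum(passed for _, passed in underage) / len(underage)

    # Stricter limit for users more than two years below the threshold.
    far_under = [passed for age, passed in underage
                 if age < threshold_age - 2]
    far_fp_rate = sum(far_under) / len(far_under) if far_under else 0.0

    return fp_rate <= 0.05 and far_fp_rate <= 0.01
```

On this framing, a method that passes 4% of 17-year-olds would comply, while one that passes even 2% of 15-year-olds would not, because the stricter 1% limit applies to users more than two years underage.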

Question 5: Do you have any information or evidence on the extent of circumvention risk affecting different age assurance methods?

We can provide reassurance to the FSC that fake ID documents, such as those recently reported by 404 Media, are detectable by our members' technology. Both Au10tix and Yoti have recently published reports with evidence that almost all such fakes are currently being discovered.  And as an industry, we are not resting on our laurels.  Project DefAI is a partnership with leading academics at the Idiap Institute in Lausanne, Switzerland, which is devising new ways to detect and defend against both deepfake and AI attacks, and to test and certify those defences.  While we are not complacent (as you cannot be when it comes to any aspect of online security), we remain confident that "liveness tests" accurately report when a user is synthetically created or altered.

Given the dynamic nature of the attacks and required responses, it may be best for the standard of defence that is considered sufficient to be determined by industry-set standards, not through regulations per se. But Ofcom could refer to relevant standards to provide greater clarity about their expectations.

Question 6: Do you agree with our proposed guidance that providers should consider accessibility and interoperability?

The FSC has reviewed the published deliverables of euCONSENT and set out its concerns in answer to this question.  The first point to clarify is that the problems users reported during the euCONSENT pilots were not with the re-usability of age checks within the network; the interoperability tests themselves were extremely successful.

There were some concerns reported about the user experience.  The providers involved had often designed their sites for use by adults rather than children, given that their existing client base consisted mostly of sites selling alcohol, gambling and tobacco.  This was important feedback, and those providers have all re-designed their user experience to be child-friendly for sites where children may need to use their solutions (not an issue an adult site needs to worry about in any case).

The FSC also highlights another concern we share: that without a level playing field of enforcement, users will defect from services that have complied to those which have not.  The British Board of Film Classification, which was originally intended to regulate adult sites in the UK, realised this risk early on and has shared that lesson very clearly with Ofcom.  This will require a different style of enforcement from the usual Pareto method adopted by regulators in markets where substitution for other sites is less easy.  With 5 million adult sites to choose from, Ofcom will need to enforce at an industrial scale.  Ministers in both Houses of Parliament acknowledged this and cited the Civil Procedure Rules, which facilitate multi-party action, allowing Ofcom to apply for business disruption and blocking orders against multiple sites simultaneously.

We acknowledge the final concern the FSC raises in its answer to this question: phishing.  All sensitive digital services are at risk of being impersonated by fake sites seeking financial, health or identity data.  It is a risk inherent to many aspects of our online life.  But a number of measures significantly mitigate the risk of age assurance being used to steal personal data:

  1. Typically, users will not select an age assurance provider themselves – they will be referred to one by a site they wish to access, so it is the site, not the user, that does the due diligence on the provider.
  2. As an industry, we promote the use of independent auditors to certify age assurance providers to the highest standards of accuracy, data security and privacy.  Users – or more often the platforms contracting age assurance services – can check with the auditors’ registries to confirm they are using a certified provider.
  3. It is likely that the first time age assurance is required is when opening a social media account at the age of 13.  These are generally the largest platforms which will be particularly careful in selecting which age assurance provider they use.
  4. As interoperability grows, a user who has already been checked when opening a social media account with a global platform may not need to repeat that check for many months.
  5. Also, to join an interoperability network, there will be further audits and certification required.

Question 7: Do you have comments on the illustrative case study we have set out in the guidance?

Again, we have some sympathy for the FSC's concern that "guidance places 100% of the liability on the party least capable of mitigating it."  We support co-regulation under which Ofcom could recognise the use of certified age assurance solutions and offer safe harbour to platforms which adopt them in good faith, even if outcomes are then imperfect.  Indeed, Ofcom will always apply a public interest test before taking enforcement action; action against a site which has done the right thing and paid a certified age assurance provider to check its users would create a disincentive to good behaviour, and so should not pass that test.

Question 10: Do you have any comments on the impact assessment set out in Annex 1?

For this answer, the FSC has created a model to project the cost impact on sites of differing sizes.  The cost of an age check is assumed to be between 10 and 50 pence per check.  The UK government's assumption is 12 pence, so this is a high estimate, even for smaller sites which may not benefit from volume discounts.  More importantly, the calculations appear to assume that a fresh age check must be paid for each unique user every month.  This does not allow for users who create an account, which need only be checked once, when it is opened.

Indeed, for the largest platform in the model, with 50m unique monthly users, this assumption implies 600m unique users annually: roughly ten times the population of the UK.  In fact, Ofcom's figures from May 2023 suggest around 13.8m visitors to adult sites each month – mostly the same visitors returning in the other months of the year.  The FSC estimates that the largest sites make £25m revenue a month, so £300m a year.  Checking all 13.8m users at 10 pence a check, once a year, would cost £1.38m, which is less than half of one percent (0.46%) of that revenue.

The same logic may well apply to small sites, where the same 100,000 monthly users form the core audience.  Checking them would cost around £20,000 (if we assume the cost per check doubles at lower volumes) against annual revenues of £250k x 12 = £3m, so roughly 0.67% of revenue.
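The arithmetic behind both estimates can be checked with a short calculation. The figures below are those quoted in the text; the function itself is simply our own illustrative framing, assuming each unique user is checked once per year, and is not part of the FSC's or Ofcom's model.

```python
# Annual age-check cost as a percentage of annual revenue, assuming each
# unique user is checked once per year. Figures are from the text; the
# function is an illustrative framing, not an official model.

def annual_check_cost_pct(unique_users, cost_per_check_gbp,
                          monthly_revenue_gbp):
    annual_cost = unique_users * cost_per_check_gbp
    annual_revenue = monthly_revenue_gbp * 12
    return 100 * annual_cost / annual_revenue

# Largest sites: 13.8m unique users, 10p per check, £25m monthly revenue.
large = annual_check_cost_pct(13_800_000, 0.10, 25_000_000)  # ~0.46%

# Small sites: 100,000 users, 20p per check, £250k monthly revenue.
small = annual_check_cost_pct(100_000, 0.20, 250_000)        # ~0.67%
```

Even under these deliberately conservative assumptions, the annual cost of checks stays well under one percent of revenue for sites at either end of the scale.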

We welcome this contribution to the debate and look forward to further discussions.

Privacy: a foundational concept for age verification

Perhaps the most frequent concern raised about age verification is a risk to privacy.  But the essence of age assurance is the ability to prove your age online WITHOUT disclosing your identity.  Our industry would not exist were there not the absolute need to preserve...