In Europe, the GDPR sets a higher bar for the processing of special category data such as biometric data. However, it is a mistake to assume that this always legally requires consent. There are other lawful bases for processing sensitive data, and perhaps the most relevant for age assurance is the public interest. The UK Information Commissioner’s Office recently issued a formal legal opinion accepting that it was in the public interest to process personal data in order to undertake age verification, provided the purpose was to protect children from harm and the extent of that harm justified this use.
This included the use of sensitive personal data, such as biometric data (e.g. facial images).
That said, there is a critical caveat to the ICO’s opinion: the data may only be used for the purpose for which it was acquired, i.e. age assurance. It cannot then become the basis for, say, targeting marketing or advertising at a particular age group. So, large platforms which make the accumulation of data about their users central to their business model would need to carefully segment the data acquired for the purpose of age assurance (and not just sensitive data, but all of it). Any audit conducted to certify an AV provider would check carefully for data leakage between different purposes.
As the legal basis for processing such data is the public interest rather than consent, there is no minimum age. (If a service does rely on consent to process your data, it must obtain that consent from a parent or legal guardian on behalf of children under 13 in the UK.)