
UK: Age Appropriate Design Code.

The Age Appropriate Design Code is a statutory code of practice on standards of age appropriate design of relevant information society services (e.g. websites) which are likely to be accessed by children. (Note: this is very different from the US COPPA law, which applies to sites purposefully directed at children.)

It is also known as The Children’s Code.

The Code came into force in Autumn 2020 and the Information Commissioner’s Office is now enforcing it.  Many large platforms have already changed the way they operate to comply with it, including applying various forms of age assurance.

We advise reading the code in full, but key sections are replicated below.

How can Age Assurance help me comply with the Age Appropriate Design Code?

The first question to ask is whether you can simply make your site appropriate for all ages, by not including harmful content and not processing personal data.  If so, you need not read any further.

If you have potentially harmful content on your site, use profiling based on user data, or rely on consent for processing personal data, you should strongly consider using age assurance techniques to check that users only access age-appropriate content, and that they are old enough to give consent for their personal data to be processed without the additional approval of a parent or legal guardian. This threshold is known as “the digital age of consent”; it is 13 in the UK, but differs across EU Member States.
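By way of illustration, a service could hold a small lookup of each jurisdiction’s digital age of consent and test a user’s verified age against it before relying on consent. The sketch below uses names of our own, and the table values are illustrative only and should be checked against current national law.

```typescript
// Digital age of consent by jurisdiction. Illustrative values only:
// verify against current national law before relying on them.
const DIGITAL_AGE_OF_CONSENT: Record<string, number> = {
  UK: 13, // set by the Data Protection Act 2018
  IE: 16,
  FR: 15,
  DE: 16,
  ES: 14,
};

// Can this user consent to processing of their data without parental
// approval? Unknown jurisdictions fall back to 16, the GDPR default.
function canConsentAlone(verifiedAge: number, jurisdiction: string): boolean {
  const threshold = DIGITAL_AGE_OF_CONSENT[jurisdiction] ?? 16;
  return verifiedAge >= threshold;
}
```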

Where it is proportionate to the risks to children which might result from your use of their data, age estimation or more robust age verification techniques may be the best means of compliance. But when implementing age assurance, sites should be careful not to create further risks to personal data if the potential for harm does not justify it. So, for example, if a child claims to be under 13, or their parent suggests that they are, it is unlikely a site would need to conduct any further due diligence on that child’s age, because they will already merit protection from any serious harms (unless, of course, the check is needed to prevent malicious adults contacting children online). Collecting extensive amounts of personal data to protect an 11-year-old child from a computer game with a PEGI 12 age rating, for example, may be hard to justify.

In every case, information society services must balance the risk of harm against the impact on privacy and other child rights.

Breaches of the General Data Protection Regulation can result in fines of up to 4% of global turnover, and the obligation to comply is not restricted to sites based in the EU.

EU Application

The Irish Data Protection Commission is developing a similar code, “Children Front and Centre: Fundamentals for a child-oriented approach to data processing”, which will extend similar requirements across the European Union to online services established in Ireland for regulatory purposes. (Many of the best-known platforms are regulated by the Irish DPC.)

Certification

GDPR legislation includes provision (Article 42) for national regulators to approve independent assurance schemes. The UK Information Commissioner’s Office (ICO) describes this process:

“Certification is a way for an organisation to demonstrate compliance with UK GDPR. Certification scheme criteria will be approved by the ICO and can cover a specific issue or be more general. Once an accredited certification body has assessed and approved an organisation, it will issue them with a certificate, and a seal or mark relevant to that scheme.

At a glance

  • Certification is a way to demonstrate your compliance with the UK GDPR and enhance transparency.
  • Certification criteria should reflect the needs of small and medium sized enterprises.
  • Certification criteria are approved by the ICO and certification issued by accredited certification bodies.
  • Certification will be issued to data controllers and data processors in relation to specific processing activities.
  • Applying for certification is voluntary. However, if there is an approved certification scheme that covers your processing activity, you may wish to consider having your processing activities certified as it can help you demonstrate compliance to the regulator, the public and in your business to business relationships.”

AVPA Audit Member the Age Check Certification Scheme offers certification under GDPR, approved by the ICO and the United Kingdom Accreditation Service (UKAS).

CODE STANDARD 2

“Age-appropriate application: Consider the age range of your audience and the needs of children of different ages. Apply the standards in this code to all users, unless you have robust age-verification mechanisms to distinguish adults from children.”

We recommend that you give your users a choice over the use of age verification wherever possible. In other words, we recommend that you provide a child-appropriate service to all users by default, with the option of age-verification mechanisms to allow adults to opt out of the protections in this code and activate more privacy-intrusive options if they wish.
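A minimal sketch of that default-on pattern follows, with hypothetical setting names: every account starts on the high-privacy, child-appropriate defaults, and only a user verified as an adult can switch a protection off.

```typescript
interface PrivacySettings {
  profiling: boolean;
  geolocation: boolean;
  dataSharing: boolean;
}

// Child-appropriate, high-privacy defaults applied to every user.
const CHILD_SAFE_DEFAULTS: PrivacySettings = {
  profiling: false,
  geolocation: false,
  dataSharing: false,
};

// Only a verified adult may activate a more privacy-intrusive option;
// children, and unverified users treated as children, keep the defaults.
function enableOption(
  user: { ageVerifiedAdult: boolean },
  settings: PrivacySettings,
  option: keyof PrivacySettings,
): PrivacySettings {
  if (!user.ageVerifiedAdult) {
    throw new Error("Age verification is required to opt out of protections");
  }
  return { ...settings, [option]: true };
}
```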

If you believe only adults are likely to access your service so that this code does not apply, you need to be able to demonstrate that this is in fact the case. If you have robust age-verification in place, that will provide the clearest evidence, but you may also be able to rely on other documented evidence that children are not likely to access the service.

If you use age-verification, make it robust and privacy-friendly 

If you do use age-verification to allow you to tailor your service to adults without regard to this code, make sure the mechanism you use is robust and effective. You must be able to demonstrate that children cannot easily circumvent the age checks.

You may be able to collect and record personal data which provides proof of age yourself. If so, remember that you need to comply with data protection obligations in relation to your collection and retention of that data, including data minimisation, purpose limitation, storage limitation and security obligations. You must not use data collected for age-verification purposes for any other purpose.
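One way to honour those obligations, sketched here under names of our own, is to examine the proof of age once, retain only the threshold outcome with an audit timestamp, and never reuse the record for any other purpose.

```typescript
// What is retained after an age check: the minimal outcome, never the
// underlying evidence (ID document, date of birth, and so on).
interface AgeCheckRecord {
  over13: boolean; // threshold result, not the date of birth itself
  checkedAt: Date;
  method: string; // e.g. "id-document" or "third-party"
}

// Whole years elapsed since a date of birth.
function yearsSince(dob: Date, now: Date = new Date()): number {
  const years = now.getFullYear() - dob.getFullYear();
  const hadBirthday =
    now.getMonth() > dob.getMonth() ||
    (now.getMonth() === dob.getMonth() && now.getDate() >= dob.getDate());
  return hadBirthday ? years : years - 1;
}

// Derive the record, then discard the evidence; data minimisation means
// nothing beyond this outcome is stored.
function recordAgeCheck(dateOfBirth: Date, method: string): AgeCheckRecord {
  return {
    over13: yearsSince(dateOfBirth) >= 13,
    checkedAt: new Date(),
    method,
  };
}
```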

Alternatively, you could consider using a trusted third-party age-verification service. This allows you to reduce the amount of personal data you collect, and take advantage of technological expertise and developments in the field. If you do use a third party service, you need to carry out some due diligence to satisfy yourself that their mechanism is suitably robust and compliant with data protection standards, and provide your users with clear information about the service you use.
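The shape of such an integration might look like the sketch below. The endpoint, field names and response are all invented for illustration, since each provider defines its own API; the point is simply that only an over/under attestation crosses the boundary, not the user’s documents or date of birth.

```typescript
// Hypothetical third-party check: the provider examines the evidence
// and the service receives only a pass/fail attestation.
async function checkAgeWithProvider(
  sessionToken: string,
  minimumAge: number,
): Promise<boolean> {
  const response = await fetch("https://av-provider.example/v1/check", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ session: sessionToken, threshold: minimumAge }),
  });
  if (!response.ok) {
    throw new Error(`Age check failed with status ${response.status}`);
  }
  const result = (await response.json()) as { overThreshold: boolean };
  return result.overThreshold; // no date of birth or documents returned
}
```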

Age-verification tools are still a developing area. The Commissioner will support work to establish clear industry standards and certification schemes to assist children, parents and online services in identifying robust age verification services which comply with data protection standards.

Tailor the measures in this code to the age range of your users

You need to take account of the information you have about the age of a child when applying the standards in this code, even if you don’t use age verification.

You can choose your approach to identifying the age of the child for these purposes:

  • Ask users to self-select their age range. This is not age verification, but allows you to tailor the experience to the declared age of the child to some extent (eg to meet transparency standards; see the sketch after this list). However, you must ensure the service applies all the safeguards in this code and takes account of the best interests of children of all ages, irrespective of the declared age of the user.
  • Rely on use of the service itself to demonstrate the appropriate age range (or equivalent developmental stage) of users. This will only be appropriate if your service is specifically designed only for children of a particular age, and you can show that it is unlikely to be accessed by children at another stage of development (eg online services aimed at pre-literate children, or early readers). This allows you to fully tailor the experience to the age range of the target group when applying the standards in this code, as the risk to other children will be minimal.
  • Use robust age-verification measures. You do not have to use age verification, but this option allows you to give adult users more choices which may not comply with this code. It also allows you more scope to tailor your service to children of different ages, while still avoiding specific risks to children in other age ranges.
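To illustrate the first of these options, here is a sketch with hypothetical names: the self-declared band tunes presentation only, such as the style of privacy notice shown, while every safeguard in the code stays on whatever the user declares.

```typescript
type DeclaredAgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "18+";

// Self-declaration is not age verification, so it only adjusts how
// information is presented; no safeguard depends on the declared band.
function privacyNoticeStyle(band: DeclaredAgeBand): string {
  switch (band) {
    case "0-5":
    case "6-9":
      return "short pictorial notice aimed at the child and their parent";
    case "10-12":
      return "simple plain-language notice";
    case "13-15":
    case "16-17":
      return "fuller notice in age-appropriate language";
    case "18+":
      return "standard privacy notice";
  }
}
```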

The Code suggests you can use age ranges as a guide to the capacity, skills and behaviours a child might be expected to display at each stage of their development, in order to assess what is appropriate for children broadly of that age. These are only suggested ranges, and it may be more appropriate to adopt different age groupings in the context of your service. For example, computer games are sold with PEGI age ratings, which might provide more relevant break points for the ranges used by a gaming platform:

  • 0 – 5: pre-literate and early literacy
  • 6 – 9: core primary school years
  • 10 – 12: transition years
  • 13 – 15: early teens
  • 16 – 17: approaching adulthood
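If a service does branch on these suggested ranges, the mapping is straightforward to encode (the function name is ours):

```typescript
// The Code's suggested developmental bands, encoded directly.
function codeAgeBand(age: number): string {
  if (age <= 5) return "pre-literate and early literacy";
  if (age <= 9) return "core primary school years";
  if (age <= 12) return "transition years";
  if (age <= 15) return "early teens";
  if (age <= 17) return "approaching adulthood";
  return "adult";
}
```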

13 is a particularly important age in the UK because of the parental responsibility verification requirements of Article 8 of the GDPR: only children aged 13 or over may provide their own consent to the processing of their data.

If a user cannot prove they are over 13, they cannot give consent to data processing 

The consequence of this is that if you do not know the age of your users, you cannot rely on consent as a legal basis for processing personal data.
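Expressed as a gate, under hypothetical names: a consent-based processing path should test a verified age, never an assumed or declared one.

```typescript
// Consent is only available as a legal basis when the user's age is
// actually known and meets the UK digital age of consent of 13.
function consentBasisAvailable(verifiedAge?: number): boolean {
  if (verifiedAge === undefined) return false; // age unknown: cannot rely on consent
  return verifiedAge >= 13; // under 13: parental consent is needed instead
}
```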

 

CODE STANDARD 11

“Switch options which use profiling off by default (unless you can demonstrate a compelling reason for profiling, taking account of the best interests of the child). Only allow profiling if you have appropriate measures in place to protect the child from any harmful effects (in particular, being fed content that is detrimental to their health or wellbeing).”

Profiling is “any form of automated processing of personal data consisting of the use of personal data to evaluate certain aspects relating to a natural person, in particular to analyse or predict aspects concerning that natural person’s performance at work, economic situation, health, personal preferences, interests, reliability, behaviour, location or movements.”

If your online service uses any profiling then you need to take appropriate steps to make sure that this does not result in harm to the child. In practice this means that if you profile children in order to suggest content to them then you need suitable measures in place to make sure that children aren’t ‘fed’ or presented with content which is detrimental to their physical or mental health or wellbeing, taking into account their age.
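A sketch of that requirement, with names of our own: profiling defaults to off, and is enabled only for a verified adult who opts in, or where safeguards suitable for the child’s known age are demonstrably in place.

```typescript
interface Visitor {
  ageVerifiedAdult: boolean;
  verifiedAge?: number; // set only once age has actually been established
}

// Standard 11 pattern: profiling is off by default. It may be switched
// on for verified adults who opt in, or where content safeguards
// appropriate to the child's age are in place.
function profilingAllowed(
  visitor: Visitor,
  optedIn: boolean,
  safeguardsSuitableFor: (age: number) => boolean,
): boolean {
  if (visitor.ageVerifiedAdult && optedIn) return true;
  if (
    visitor.verifiedAge !== undefined &&
    safeguardsSuitableFor(visitor.verifiedAge)
  ) {
    return true;
  }
  return false; // the default
}
```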

Content or behaviours that may be detrimental to children’s health and wellbeing (taking into account their age) include:

  • advertising or marketing content that is contrary to CAP guidelines on marketing to children;
  • film or on-demand television content that is classified as unsuitable for the age group concerned;
  • music content that is labelled as parental advisory or explicit;
  • pornography or other adult or violent content;
  • user-generated content (content that is posted by other internet users) that is obviously detrimental to children’s wellbeing or is formally recognised as such (eg pro-suicide, pro-self-harm or pro-anorexia content, and content depicting or advocating risky or dangerous behaviour by children); and
  • strategies used to extend user engagement, such as timed notifications that respond to inactivity.

Ultimately, if you believe that it is not feasible for you to put suitable measures in place, then you will not be able to profile children for the purposes of recommending online content. In this circumstance you need to make sure that children cannot change any privacy settings which allow this type of profiling.