2 December 2020

Keeping Children Safe Online – The New 2020 Code

By Carol Tullo, OBE, Senior Consultant

In recent months, and perhaps a little under the radar with so much else going on, two significant initiatives that focus on the protection of children have been published. The UK Government’s Online Harms White Paper in 2019 highlighted that 99% of 12-15 year olds in the UK were online, spending an average of twenty and a half hours a week on the internet. UNICEF in 2017 estimated that a third of internet users globally were under 18, and in the same year Ofcom indicated that 50% of 3-5-year-olds and 90% of 8-11-year-olds were online. The ICO (the UK’s Information Commissioner’s Office) refers to one in five UK internet users being children. Whichever way we look at the statistics, in a year which has seen a seismic shift to online conversations, education provision, entertainment and communication, initiatives to protect children and their data online are a concern for us all.

The Data Protection Act 2018, section 123, states that “The Commissioner must prepare a code of practice which contains such guidance as the Commissioner considers appropriate on standards of age-appropriate design of relevant information society services which are likely to be accessed by children.” This Age Appropriate Design Code was published and came into effect on 2 September 2020[i], after consultation on an earlier draft published in January 2019. Organisations now have less than a year to get their houses in order and comply by 2 September 2021.

The General Data Protection Regulation (GDPR) was incorporated into UK law in the Data Protection Act 2018 and applied in the UK from 25 May 2018. After EU exit day, references to GDPR mean the equivalent provisions in UK GDPR, which applies in the UK in the same way as GDPR did before exit day. From 31 December 2020, UK GDPR and the Age Appropriate Design Code remain in effect.

The Code is not new law but rather it sets out and explains how GDPR applies in relation to children using digital services. Data protection cannot be viewed in isolation or out of context. This is particularly relevant with respect to children and the risks they encounter online. Online harms exist, with few controls in place. Innovations in technology and technical standards must also be considered as they can mitigate those harms. This should be factored into any data protection assessment of risk, proportionality, and benefit.  Children are vulnerable data subjects and require extra care.  This is, in essence, what the Code sets out.

The Code contains 15 flexible standards of age appropriate design reflecting a risk-based approach. It is designed to help where you may not be sure what to do, but it is not prescriptive: the aim is to give you enough flexibility to develop services which conform to the standards in your own way, taking a proportionate and risk-based approach. It will help you to design services that are GDPR compliant. The focus is on providing default settings which ensure that children have the best possible access to online services while minimising data collection and use. The standards are:

  1. Best interests of the child
  2. Data Protection Impact Assessment
  3. Age appropriate application
  4. Transparency
  5. Detrimental use of data
  6. Policies and community standards
  7. Default settings
  8. Data minimisation
  9. Data sharing
  10. Geolocation
  11. Parental controls
  12. Profiling
  13. Nudge techniques
  14. Connected toys and devices
  15. Online tools

Age appropriate services are GDPR compliant services designed for the age range of their users, with tailored messaging, controls and safety features appropriate to that age. Services include social media platforms, music and video streaming, providers of educational games and services, chat services and other online connectivity where children access your services. The Code also ensures that children who choose to change their default settings get the right information, guidance and advice, and understand the implications. Settings must be ‘high privacy’ by default. Follow the ICO’s flowchart to assess the services you provide[ii].

In November, the UK Government published the Verification of Children Online (VoCO) report, phase 2 of a child online safety research project that responds to the challenge of knowing which online users are children. It aims to bring about an internet that actively recognises children and adapts the spaces they use to make them safer by design, and to shape the government’s response to online harms. In furthering these aims, the programme of work included the testing of technical solutions, evaluation of Trust Frameworks and stakeholder engagement, through the eyes of parents, children and platforms. The standards review in the research tested and mapped its results against the Code’s 15 standards to identify a Template Standard to deliver the VoCO Proposition that “If platforms could verify which of their users were children, then as a society we would be better empowered to protect children from harm as they grow up online…”

You must follow the standards as part of your approach to complying with data protection law. You must implement them all, to the extent they are relevant to your service, in order to demonstrate your compliance.

First steps for implementation:

  • Review the online services you offer to children
  • Implement a child-safety-focused Data Protection Impact Assessment as a standard process
  • Seek parental consent
  • Standardise messaging and any dashboard features
  • Set default settings for privacy, geolocation and safety
  • Adopt clear age appropriate messaging that adapts to the age ranges of your audience

Your internal review and implementation should be monitored to measure the benefits, and shared to encourage greater participation and uptake. It will form part of your Record of Processing Activity and compliance audit trail. Parents and teachers will be reassured and have confidence that safety is built into services in a consistent, trusted way; children will understand what they can do; and platforms will reduce the risk of harm by following the standards. Win-win all round.

[i] https://ico.org.uk/for-organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-practice-for-online-services/

[ii] https://ico.org.uk/for-organisations/guide-to-data-protection/key-data-protection-themes/age-appropriate-design-a-code-of-practice-for-online-services/annex-a-services-covered-by-the-code-flowchart/

© Naomi Korn Associates, 2020. Some Rights Reserved. The text is licensed for use under a Creative Commons Attribution Share Alike Licence (CC BY SA)

Disclaimer: The material in this blog post is for general information only and is not legal advice. Always consult a qualified lawyer about a specific legal problem.
