The Children’s Code, also known as the Age Appropriate Design Code, provides protection for children up to age 18 when they visit websites or download an app or game.
It contains 15 standards that companies must consider when designing their services or products, especially those aimed at children.
As a risk-based code, not all organisations will be expected to implement the standards equally stringently – those which use, analyse and profile children’s data will have the most to do.
The new statutory code aims to put children’s privacy at the heart of online services, pushing organisations to recognise children require special protection when online.
The 15 standards cover different elements of age appropriate design, specifically focused on services and products targeted at children.
These include high privacy default settings and a commitment to collect only the amount of data needed to provide the service the child is using – i.e. data collected through an app should not be repurposed to influence something completely different.
Administered by the Information Commissioner’s Office (ICO), the code applies to all major social media and online services used by children in the UK.
The standards are legally enforceable and the ICO has the power to impose fines of up to 4% of global turnover or £17m on a data controller.
What are the standards?
The 15 Children’s Code standards which designers have to consider are:
- Best interests of the child – Primary consideration when designing and developing services likely to be accessed by children
- Data protection impact assessments – Factor in different ages and different development needs of children
- Age appropriate application – Take a risk-based approach to recognising the age of users and applying the code standards to child users
- Transparency – Privacy information must be suited to understanding of child users
- Detrimental use of data – Organisations can’t use children’s personal data in ways that are detrimental to their wellbeing or that go against regulations or Government advice
- Policies and community standards – Organisations must stick to their own published terms, policies and standards
- Default settings – ‘High privacy’ settings must be the default unless there is a compelling reason
- Data minimisation – Collect and retain the minimum amount of data needed to provide the service
- Data sharing – Organisations shouldn’t disclose children’s data unless there is a compelling reason
- Geolocation – Switch geolocations off by default unless there is a compelling reason and provide a signal to children when location tracking is active
- Parental controls – Give children age appropriate information about any parental controls, and provide an obvious sign to the child when they are being monitored
- Profiling – Profiling must be off by default unless there is a compelling reason
- Nudge techniques – Organisations must not use nudge techniques to encourage children to weaken their privacy protections or provide unnecessary personal data
- Connected toys and devices – Any connected devices provided by the organisation must conform to the code
- Online tools – Organisations must provide accessible tools to help children report concerns about their data privacy
Organisations have 12 months to transition to the new arrangements, and the ICO will use feedback from this period to create packages of support that help organisations adapt their online products and services by September 2021.
Will the code work?
As we explain in our guide to whether the Government can protect children online, there is a long-running debate about how the Government can and should protect kids when they’re browsing and gaming online.
Back in 2015, Sky pioneered switching parental controls on by default with its Sky Broadband Shield, after it was revealed that only 4% of households activated network-level parental controls when signing up to a broadband deal.
The ICO’s new code takes a different approach by forcing organisations to be proactive about child privacy when they’re designing a product or service rather than leaving every judgement to the end user – children or their parents.
The fact that organisations that target children need to do more than those that don’t makes sense, yet it sits uncomfortably alongside the Government’s decision to scrap pornography age verification proposals in October 2019.
As such sites aren’t targeted at children, they won’t necessarily be covered under the new ICO rules, although they are supposed to employ age verification techniques anyway.
Notably, the new rules do cover social media websites which weren’t covered in the age verification proposals. Read our guide on keeping kids safe on social media.