The UK Information Commissioner’s Office (ICO) has launched investigations into TikTok, Reddit, and Imgur to evaluate their compliance with the UK’s Children’s Code and data protection laws. These investigations focus on how these platforms collect, use, and safeguard the personal information of users under the age of 18. The outcome of these inquiries could significantly impact how online platforms operate in the UK and set new precedents for child data protection globally.

The Children’s Code: Strengthening Online Protections for Young Users

The code applies to UK-based companies and to non-UK companies that process the personal data of UK children.

The ICO sets out the 15 Standards of Age-Appropriate Design[1]:

  • Best Interests of the Child:[2] Online services must always design with the child’s well-being and rights as the foremost priority, ensuring that commercial objectives never override a child’s safety and privacy.
  • Data Protection Impact Assessments (DPIAs): Providers are required to conduct DPIAs to identify and mitigate any risks that their data processing might pose to children, taking into account differences in age, capacity, and developmental needs.
  • Age-Appropriate Application: Services should adopt a risk-based approach to establishing user age, either by accurately verifying age or by applying these standards uniformly to all users, to ensure that children receive proper safeguards.
  • Transparency:[3] Privacy notices, terms, and policies must be clear, concise, and presented in language that children can easily understand, with “bite-sized” explanations provided where necessary.
  • Detrimental Use of Data: Children’s personal data must not be used in ways that could harm their well-being or otherwise adversely affect their physical or mental health, including through manipulative practices.
  • Policies and Community Standards: Providers should uphold and clearly communicate their own rules, such as privacy policies, age restrictions, and content guidelines, to ensure consistency and accountability in protecting children’s data.
  • Default Settings: Services must offer the highest level of privacy by default, unless there is a compelling reason for a different configuration, so that children benefit from maximum protection without needing to change settings themselves (see the sketch after this list).
  • Data Minimization: Only the minimum personal data necessary to deliver a service element should be collected and retained, ensuring that children are not subjected to excessive data collection.
  • Data Sharing: Children’s data should not be disclosed to third parties unless there is a clear and compelling reason that aligns with the child’s best interests, minimizing potential external risks.
  • Geolocation: Geolocation features must be switched off by default to protect a child’s physical safety, with clear indicators when location tracking is active and safeguards to prevent persistent tracking.
  • Parental Controls: If a service includes parental control features, it must clearly notify the child when such monitoring is in place, balancing parental oversight with the child’s awareness and privacy rights.
  • Profiling: Profiling should be switched off by default for child users, so that any personalization or advertising based on profiling is enabled only if it is demonstrably safe and in the child’s best interests.
  • Nudge Techniques: Services must not use design techniques that encourage or “nudge” children into providing additional personal data or relaxing their privacy settings, protecting them from inadvertent data disclosure.
  • Connected Toys and Devices: For services that include connected toys or devices, providers must incorporate effective tools and design measures to ensure that these products comply with the Code’s privacy and data protection standards.
  • Online Tools: Providers should supply accessible and prominent tools that enable children to easily exercise their data rights, such as accessing, correcting, or deleting their data, and to report concerns.
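
To make the Default Settings, Geolocation, and Profiling standards concrete, the sketch below shows one way a service might derive account defaults from a user’s age band. It is a minimal TypeScript illustration under assumed names (AgeBand, AccountDefaults, defaultsForAgeBand); it is not drawn from ICO guidance or from any platform’s actual implementation.

```typescript
// Hypothetical age bands; under the Code, anyone under 18 is a child.
type AgeBand = "under13" | "13to17" | "adult";

// Illustrative settings covering the defaults the Code addresses.
interface AccountDefaults {
  geolocationEnabled: boolean;        // Geolocation: off by default for children
  profilingEnabled: boolean;          // Profiling: off by default for children
  personalisedAdsEnabled: boolean;    // Advertising driven by profiling
  accountVisibility: "private" | "public";
  parentalMonitoringNotice: boolean;  // Tell the child when monitoring is active
}

// Children get the highest-privacy configuration; adult defaults are outside
// the Code's scope and are shown here only for contrast.
function defaultsForAgeBand(band: AgeBand): AccountDefaults {
  const isChild = band !== "adult";
  return {
    geolocationEnabled: !isChild,
    profilingEnabled: !isChild,
    personalisedAdsEnabled: !isChild,
    accountVisibility: isChild ? "private" : "public",
    parentalMonitoringNotice: isChild,
  };
}

// Example: a 15-year-old's account starts private, with geolocation,
// profiling, and personalised ads all switched off.
console.log(defaultsForAgeBand("13to17"));
```

The direction of the defaults is the point: a child account starts in the most protective configuration, and any deviation would need the kind of compelling, documented justification the Code describes.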

The Children’s Code, formally known as the Age-Appropriate Design Code, was introduced in 2021 as part of the UK’s commitment to protecting children’s digital rights. The code requires online services that are likely to be accessed by children to prioritize their privacy, security, and well-being. It mandates strict limitations on data collection, the use of profiling, and the application of behavioral advertising to underage users. Platforms must also ensure transparency by presenting privacy policies in language that children can easily understand.

Since its enforcement, the Children’s Code has been instrumental in reshaping the approach of social media and video-sharing companies toward child safety. However, as technology evolves, new challenges continue to emerge, necessitating ongoing regulatory oversight and enforcement.

[Infographic: Steps to ensure compliance with the UK Children’s Code]

📌 Summary Compliance Checklist:

1. Conduct and document a DPIA (the ICO provides a self-assessment tool).

2. Enable the highest-privacy settings by default.

3. Provide child-friendly privacy notices.

4. Ensure design prioritizes children’s best interests.

5. Switch geolocation off unless a different setting is explicitly justified.

6. Engage a privacy compliance partner (e.g., PRIVO).

7. Periodically review and update compliance measures.

Following these steps comprehensively and proactively helps ensure compliance with the UK Children’s Code while providing robust protection and clarity for child users. For a tailored compliance checklist specific to your company’s needs, consult a qualified privacy lawyer.

The UK Children’s Code v. the U.S. Children’s Online Privacy Protection Act

The UK Children’s Code (also known as the Age-Appropriate Design Code) and the U.S. Children’s Online Privacy Protection Act (COPPA) both aim to protect children’s online privacy but differ in scope, requirements, and enforcement. COPPA, enforced by the Federal Trade Commission (FTC), applies to online services directed at children under 13, requiring parental consent for data collection and placing restrictions on targeted advertising. In contrast, the UK Children’s Code, enforced by the Information Commissioner’s Office (ICO), has a broader scope, covering all users under 18 and focusing on data minimization, default privacy protections, and age-appropriate transparency. While COPPA primarily regulates how businesses collect and share children’s data, the Children’s Code emphasizes design principles that prioritize children’s best interests, making it a more comprehensive framework for digital services operating in the UK.

Investigations into TikTok, Reddit, and Imgur: Algorithmic Risks, Targeted Content, and Age Assurance

The ICO has launched three investigations, into TikTok, Reddit, and Imgur, to assess the platforms’ compliance with UK data protection law and the Children’s Code. The investigations aim to determine how each platform collects, processes, and safeguards children’s personal information, particularly in its recommender systems and age assurance measures.

The ICO’s TikTok investigation focuses on how the platform uses the personal data of 13- to 17-year-olds to recommend and deliver content, addressing concerns about children being exposed to inappropriate or harmful material. Concurrently, the ICO is examining Reddit’s and Imgur’s age verification mechanisms to ensure proper safeguards are in place for child users.
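
Neither the ICO nor the platforms have published how these systems work internally, but the regulatory concern can be illustrated with a hypothetical TypeScript sketch in which behaviour-based personalisation is enabled only when age assurance gives reasonable confidence that the user is an adult. The interfaces, the 0.9 confidence threshold, and the function names are assumptions for illustration only.

```typescript
// Hypothetical outcome of an age assurance check; real systems combine
// self-declaration, age estimation, and hard verification signals.
interface AgeAssurance {
  estimatedAge: number; // best available estimate of the user's age
  confidence: number;   // 0..1 confidence in that estimate (assumed scale)
}

// Illustrative recommender switches relevant to the Code's concerns.
interface RecommenderOptions {
  usePersonalisation: boolean;   // behaviour-based ranking of recommendations
  allowSensitiveTopics: boolean; // categories unsuitable for child users
}

// Only unlock personalisation when adulthood is reasonably assured; anything
// uncertain is treated as a child account. The 0.9 threshold is an assumption.
function optionsFor(assurance: AgeAssurance): RecommenderOptions {
  const likelyAdult = assurance.estimatedAge >= 18 && assurance.confidence >= 0.9;
  return {
    usePersonalisation: likelyAdult,
    allowSensitiveTopics: likelyAdult,
  };
}

// Example: a confidently identified 16-year-old still gets the
// non-personalised, restricted configuration.
console.log(optionsFor({ estimatedAge: 16, confidence: 0.95 }));
```

Under this kind of gating, an account that cannot confidently be assured as adult is treated as a child account, mirroring the Code’s expectation that services either establish age with appropriate certainty or apply the children’s standards to everyone.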

These investigations are part of broader efforts to enforce children’s privacy protections, ensure compliance with data protection laws, and hold companies accountable for designing child-safe digital services. The ICO has emphasized that while innovation is welcomed, it cannot come at the expense of children’s privacy and safety online.

Recent Regulatory Progress in Child Data Protection

Since the Children’s Code took effect, several high-profile platforms have implemented changes to their services in response to ICO guidance:

  • X (formerly Twitter): Stopped serving personalized ads to users under 18 and disabled geolocation sharing options (ICO, 2024).
  • Sendit: Removed automatic geolocation data from children’s profiles, enhancing privacy (ICO, 2024).
  • BeReal: Prohibited minors from sharing precise location data (ICO, 2024).
  • Dailymotion: Strengthened privacy policies to discourage children from oversharing personal information (ICO, 2024).
  • Viber: Disabled behaviorally targeted advertising for children, ensuring that their digital footprint is not used for marketing purposes (ICO, 2024).

The ICO also released a comprehensive progress report comparing privacy practices across 29 social media and video-sharing platforms. This transparency initiative aims to educate parents, policymakers, and businesses on best practices for child data protection.

The Future of Children’s Online Privacy: ICO and Ofcom’s Collaborative Efforts

The ICO is working closely with Ofcom, the UK’s communications and online safety regulator, to enforce child protection laws under the Online Safety Act. This collaboration ensures a unified regulatory approach to tackling online risks, including harmful content exposure, cyberbullying, and child exploitation.

Under UK law, a child is legally defined as anyone under 18, following the UN Convention on the Rights of the Child. Platforms that fail to implement adequate safeguards risk facing severe legal consequences.[4]

As investigations into TikTok, Reddit, and Imgur continue, the ICO remains committed to strengthening online privacy protections for children. These ongoing regulatory efforts serve as a clear warning to tech companies that children’s rights and safety must be prioritized in digital spaces.

  1. Data Protection Act 2018, § 123 (U.K.), https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/standards-of-age-appropriate-design/. ↩︎
  2. United Nations Convention on the Rights of the Child, art. 3, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/1-best-interests-of-the-child/. ↩︎
  3. General Data Protection Regulation, art. 5(1), https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/4-transparency/. ↩︎
  4. Information Commissioner’s Office, Enforcement of the Children’s Code, https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/enforcement-of-this-code/. ↩︎


Quote of the week

Civilization is the progress toward a society of privacy. The savage’s whole existence is public, ruled by the laws of his tribe. Civilization is the process of setting man free from men.

~ Ayn Rand