
UK now expects compliance with children’s privacy design code – TechCrunch

In the U.K., a 12-month grace period for compliance with a design code aimed at protecting children online expires today — meaning app makers offering digital services in the market that are “likely” to be accessed by children (defined in this context as users under 18 years old) are expected to comply with a set of standards intended to safeguard kids from being tracked and profiled. The Age Appropriate Design Code (the ‘Children’s Code’) came into force on September 2 last year; however, the U.K.’s data protection watchdog, the ICO, allowed the maximum grace period for hitting compliance to give organizations time to adapt their services.

But from today, it expects the standards of the code to be met. Services where the code applies can include connected toys and games, edtech and online retail, and for-profit online services such as social media and video-sharing platforms with a strong pull for minors. Among the code’s stipulations is that a level of “high privacy” should be applied to settings by default if the user is (or is suspected to be) a child — including specific provisions that geolocation and profiling should be off by default (unless there’s a compelling justification for such privacy-hostile defaults).
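
To make that defaults standard concrete, here is a minimal sketch in TypeScript of what “high privacy by default” could look like in practice. The settings shape, field names, and the fallback adult defaults are assumptions made for illustration; the code itself prescribes outcomes, not any particular data model.

```typescript
// Hypothetical illustration of the "high privacy by default" standard.
// None of these names come from the ICO code; they are assumptions
// for the sake of the example.

interface PrivacySettings {
  geolocationSharing: boolean;
  profilingForRecommendations: boolean;
  behaviouralAds: boolean;
  dataVisibleToOtherUsers: "private" | "friends" | "public";
}

// If the user is, or is suspected to be, under 18, every data-hungry
// option starts switched off; it takes an explicit choice (or a
// compelling, documented justification) to turn it on.
function defaultSettings(isLikelyChild: boolean): PrivacySettings {
  if (isLikelyChild) {
    return {
      geolocationSharing: false,
      profilingForRecommendations: false,
      behaviouralAds: false,
      dataVisibleToOtherUsers: "private",
    };
  }
  return {
    geolocationSharing: false, // still a sensible adult default
    profilingForRecommendations: true,
    behaviouralAds: true,
    dataVisibleToOtherUsers: "friends",
  };
}
```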


The code also instructs app makers to provide parental controls while giving the child age-appropriate information about such tools — warning against parental tracking tools that could be used to silently/invisibly monitor a child without them being made aware of the active tracking. Another standard takes aim at dark pattern design — warning app makers against using “nudge techniques” to push children to provide “unnecessary personal data or weaken or turn off their privacy protections.” The complete code contains 15 standards but is not itself baked into legislation — instead, it’s a set of design recommendations the ICO wants app makers to follow.
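
As an illustration of the “no silent monitoring” point, here is a small hypothetical sketch of a child-facing notice that stays visible whenever parental tracking is active. The control names, age tiers, and wording are invented for this example, not drawn from the code.

```typescript
// Illustrative only: surface an age-appropriate notice to the child
// whenever any parental tracking feature is switched on, rather than
// monitoring them invisibly.

interface ParentalControls {
  locationTracking: boolean;
  activityReports: boolean;
}

// Returns the notice the child's UI should display, or null if no
// monitoring is active. The thresholds and copy are assumptions.
function monitoringNotice(controls: ParentalControls, childAge: number): string | null {
  const active = Object.entries(controls)
    .filter(([, enabled]) => enabled)
    .map(([name]) => name);

  if (active.length === 0) return null;

  return childAge < 13
    ? "A parent can see where you are and what you do in this app."
    : `Parental monitoring is on for: ${active.join(", ")}.`;
}
```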

The regulatory stick to make them do so is that the watchdog explicitly links compliance with its children’s privacy standards to passing muster with the wider data protection requirements baked into U.K. law. The risk for apps that ignore the standards is thus that they draw the watchdog’s attention — either through a complaint or a proactive investigation — with the potential of a more comprehensive ICO audit delving into their whole approach to privacy and data protection.

“We will monitor conformance to this code through a series of proactive audits, will consider complaints, and take appropriate action to enforce the underlying data protection standards, subject to applicable law and in line with our Regulatory Action Policy,” the ICO writes in guidance on its website. “To ensure proportionate and effective regulation, we will target our most significant powers, focusing on organizations and individuals suspected of repeated or wilful misconduct or serious failure to comply with the law.”

It goes on to warn that it would view a lack of compliance with the kids’ privacy code as a potential black mark against (enforceable) U.K. data protection law, adding: “If you do not follow this code, you may find it difficult to demonstrate that your processing is fair and complies with the GDPR [General Data Protection Regulation] or PECR [Privacy and Electronic Communications Regulation].”

In a blog post last week, Stephen Bonner, the ICO’s executive director of regulatory futures and innovation, also warned app makers: “We will be proactive in requiring social media platforms, video and music streaming sites, and the gaming industry to tell us how their services are designed in line with the code. We will identify areas where we may need to provide support or, should the circumstances require, we have powers to investigate or audit organizations.”

“We have identified that currently, some of the biggest risks come from social media platforms, video and music streaming sites, and video gaming platforms,” he went on. “In these sectors, children’s data is being used and shared to bombard them with content and personalized service features. This may include inappropriate adverts, unsolicited messages and friend requests, and privacy-eroding nudges urging children to stay online. We’re concerned with several harms that could be created due to this data use, which are physical, emotional, psychological, and financial.”

“Children’s rights must be respected, and we expect organizations to prove that children’s best interests are a primary concern. The code clarifies how organizations can use children’s data in line with the law. We want to see organizations committed to protecting children by developing designs and services following the code,” Bonner added. The ICO’s enforcement powers — at least on paper — are reasonably extensive, with GDPR, for example, allowing it to fine infringers up to £17.5 million or 4% of their annual worldwide turnover, whichever is higher.
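
For a sense of what “whichever is higher” means in practice, here is a one-function worked example of that maximum-fine rule; the turnover figure used is made up purely for illustration.

```typescript
// Worked example of the UK GDPR maximum-fine rule quoted above:
// the higher of £17.5M or 4% of annual worldwide turnover.

function maxUkGdprFine(annualWorldwideTurnoverGbp: number): number {
  const fixedCap = 17_500_000;                            // £17.5 million
  const turnoverCap = annualWorldwideTurnoverGbp * 0.04;  // 4% of turnover
  return Math.max(fixedCap, turnoverCap);
}

// e.g. a company turning over £2bn a year:
// 4% of £2,000,000,000 = £80,000,000, which exceeds £17.5M.
console.log(maxUkGdprFine(2_000_000_000)); // 80000000
```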

The watchdog can also issue orders banning data processing or requiring service changes deemed non-compliant. So, apps that flout the children’s design code risk setting themselves up for regulatory bumps or worse. In recent months, there have been signs some significant platforms have been paying mind to the ICO’s compliance deadline — with Instagram, YouTube, and TikTok all announcing changes to how they handle minors’ data and account settings ahead of the September 2 date.

In July, Instagram said it would default teens to private accounts — doing so for under-18s in certain countries, which the platform confirmed to us includes the U.K. — among several other child-safety-focused tweaks. Then, in August, Google announced similar changes for accounts on its video-sharing platform, YouTube. A few days later, TikTok also said it would add more privacy protections for teens, having already made earlier changes to tighten privacy defaults for under-18s.

Apple also recently got itself into hot water with the digital rights community following the announcement of child safety-focused features — including a child sexual abuse material (CSAM) detection tool that scans photo uploads to iCloud and an opt-in parental safety feature that lets iCloud Family account users turn on alerts related to the viewing of explicit images by minors using its Messages app.

The unifying theme underpinning all these mainstream platform product tweaks is “child protection.” And while there’s been growing attention in the U.S. to online child safety and the nefarious ways in which some apps exploit kids’ data — as well as several open probes in Europe (such as this Commission investigation of TikTok, acting on complaints) — the U.K. may be having an outsized impact here given its concerted push to pioneer age-focused design standards. The code also combines with incoming U.K. legislation, which is set to apply a “duty of care” on platforms to take a broad-brush safety-first stance toward users, with a big focus on kids (and there it’s also being broadly targeted to cover all children; rather than just applying to kids under 13 as with COPPA in the U.S., for example).

In the blog post ahead of the compliance deadline expiring, the ICO’s Bonner sought to take credit for what he described as “significant changes” made in recent months by platforms like Facebook, Google, Instagram, and TikTok, writing: “As the first of its kind, it also has an influence globally. U.S. Senate and Congress members have called on major U.S. tech and gaming companies to voluntarily adopt the standards in the ICO’s code for children in America.” “The Data Protection Commission in Ireland is preparing to introduce the Children’s Fundamentals to protect children online, which links closely to the code and follows similar core principles,” he noted.

And there are other examples in the EU: France’s data watchdog, the CNIL, looks to have been inspired by the ICO’s approach — issuing its own set of child-protection-focused recommendations this June (which also, for example, encourage app makers to add parental controls with the explicit caveat that such tools must “respect the child’s privacy and best interests”). The U.K.’s focus on online child safety is not just making waves overseas but sparking growth in the domestic compliance services industry. Last month, the ICO announced the first clutch of GDPR certification scheme criteria — including two schemes focusing on the age-appropriate design code. Expect plenty more.

Bonner’s blog post also notes that the watchdog will formally set out its position on age assurance this autumn — so it will provide further steering to organizations in the scope of the code on tackling that tricky piece. However, it’s still unclear how hard a requirement the ICO will push for, with Bonner suggesting it could involve “verifying ages or age estimation.” Whatever the recommendations are, age assurance services look set to spring up with compliance-focused sales pitches. Watch that space. Children’s online safety has been a massive focus for U.K. policymakers recently. However, the more comprehensive (and long in train) Online Safety (née Harms) Bill remains at the draft law stage.

An earlier attempt by U.K. lawmakers to bring in mandatory age checks to prevent kids from accessing adult content websites — dating back to 2017’s Digital Economy Act — was dropped in 2019 after widespread criticism that it would be both unworkable and a massive privacy risk to adult users of porn. However, the government has not wavered in its determination to find a way to regulate online services in the name of child safety. And online age verification checks look set to be — if not a blanket, hard requirement for all digital services — increasingly brought in through the backdoor, via a sort of “recommended feature” creep (as the ORG has warned).

The current recommendation in the age-appropriate design code is that app makers “take a risk-based approach to recognize the age of individual users and ensure you effectively apply the standards in this code to child users,” suggesting they: “Either establish age with a level of certainty that is appropriate to the risks to the rights and freedoms of children that arise from your data processing or apply the standards in this code to all your users instead.” At the same time, the government’s broader push on online safety risks conflicts with some of the laudable aims of the ICO’s non-legally binding children’s privacy design code.
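
To make the “either/or” in that recommendation concrete, here is a minimal sketch assuming a hypothetical service with its own age-assurance signal; the confidence threshold, signal shape, and function names are illustrative assumptions, not anything the code specifies.

```typescript
// Illustrative branch for the code's risk-based recommendation:
// either establish age with a level of certainty appropriate to the
// risk, or simply apply the children's-code standards to every user.

interface AgeSignal {
  estimatedAge: number; // e.g. from self-declaration plus other checks
  confidence: number;   // 0..1, how sure the service is
}

function applyChildStandards(signal: AgeSignal | null, riskLevel: "low" | "high"): boolean {
  // Higher-risk processing demands more certainty before treating
  // someone as an adult. The threshold values here are assumptions.
  const requiredConfidence = riskLevel === "high" ? 0.9 : 0.6;

  // No age signal, or not confident enough: fall back to treating the
  // user as a child and applying the code's standards across the board.
  if (!signal || signal.confidence < requiredConfidence) return true;

  return signal.estimatedAge < 18;
}
```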

For instance, while the code includes the (welcome) suggestion that digital services gather as little information about children as possible, in an announcement earlier this summer, U.K. lawmakers put out guidance for social media platforms and messaging services — ahead of the planned Online Safety legislation — that recommends they prevent children from being able to use end-to-end encryption. That’s right; the government’s advice to data-mining platforms — which it suggests will help prepare them for requirements in the incoming legislation — is not to use “gold standard” security and privacy (E2E encryption) for kids.

So the official U.K. government message to app makers appears to be that, in short order, the law will require commercial services to access more of kids’ information, not less — in order to keep them “safe.” Which is quite a contradiction of the data minimization push in the design code. The risk is that a tightening spotlight on kids’ privacy gets fuzzed and complicated by ill-thought-through policies that push platforms to monitor kids to demonstrate “protection” from an array of online harms — adult content or pro-suicide postings, cyberbullying, and CSAM.

The law looks set to encourage platforms to ‘show their workings’ to prove compliance — which risks resulting in ever-closer tracking of children’s activity, retention of data — and maybe risk profiling and age verification checks (that could even end up being applied to all users; think sledgehammer to crack a nut). In short, a privacy dystopia. Such mixed messages and disjointed policymaking seem set to pile increasingly confusing — and even conflicting — requirements on digital services operating in the U.K., making tech businesses legally responsible for divining clarity amid the policy mess — with the simultaneous risk of massive fines if they get the balance wrong.
