An international coalition of consumer protection, digital and civil rights organizations, and data protection experts has added its voice to growing calls for a ban on what’s been billed as “surveillance-based advertising”. The objection is to a form of digital advertising that relies upon a massive apparatus of background data processing, which sucks in information about individuals as they browse and use services in order to build profiles that determine which ads they are served (via multi-participant processes like the high-speed auctions known as real-time bidding). The E.U.’s lead data protection supervisor previously called for a ban on targeted advertising that relies upon pervasive tracking, warning of the many associated risks to people’s rights. Last fall, the E.U. parliament also urged tighter rules on behavioral ads.
In March, a U.S. coalition of privacy, consumer, competition, and civil rights groups also took collective aim at microtargeting. So the pressure is growing on lawmakers on both sides of the Atlantic to tackle exploitative adtech as consensus builds over the damage associated with mass surveillance-based manipulation. At the same time, momentum is building for pro-privacy consumer tech and services, underscoring the rising value that users and innovators place on business models that respect people’s data. The growing uptake of such services shows that alternative, rights-respecting digital business models are not only possible (and accessible, with many freemium offerings) but increasingly prevalent.
In an open letter addressing E.U. and U.S. policymakers, the international coalition — which comprises 55 organizations and more than 20 experts, including groups like Privacy International, the Open Rights Group, the Center for Digital Democracy, the New Economics Foundation, Beuc, Edri, and Fairplay — urges legislative action, calling for a ban on ads that rely on “systematic commercial surveillance” of Internet users to serve what Facebook founder Mark Zuckerberg likes, euphemistically, to refer to as ‘relevant ads’. The problem with Zuckerberg’s (self-serving) framing is that, as the coalition points out, most consumers don’t want to be spied upon in order to be served these creepy ads.
Any claimed ‘relevance’ is irrelevant to consumers who experience ad-stalking as creepy and unpleasant. (And imagine how the average Internet user would feel if they could peek behind the adtech curtain and see the vast databases where people are profiled at scale so their attention can be sliced and diced for commercial interests and sold to the highest bidder.) The coalition points to a report examining consumer attitudes to surveillance-based advertising, prepared by one of the letter’s signatories (the Norwegian Consumer Council; NCC), which found that only one in ten people is positive about commercial actors collecting information about them online — and only one in five thinks ads based on personal information are okay.
A full third of respondents to the survey were “very negative” about microtargeted ads, while almost half think advertisers should not be able to target ads based on personal information. The report also highlights a sense of impotence among consumers when they go online, with six out of ten respondents feeling that they have no choice but to give up information about themselves. That finding should be particularly concerning for E.U. policymakers, as the bloc’s data protection framework is supposed to provide citizens with a suite of rights related to their data that should protect them against being pressured into handing over their information — including stipulating that if a data controller intends to rely on user consent to process data, then consent must be informed, specific and freely given; it can’t be stolen, strong-armed or sneaked through using dark patterns. (Although that remains all too often the case.)
Forced consent is not legal under E.U. law — yet, per the NCC’s European survey, most respondents feel they have no choice but to be creeped on when they use the Internet. That, in turn, points to an ongoing E.U. enforcement failure over major adtech-related complaints, scores of which have been filed in recent years under the General Data Protection Regulation (GDPR) — some of which are now over three years old yet still haven’t resulted in any action against rule-breakers. Over the past couple of years, E.U. lawmakers have acknowledged problems with patchy GDPR enforcement — and it’s notable that the Commission suggested some alternative enforcement structures in its recent digital regulation proposals, such as for oversight of very large online platforms in the Digital Services Act (DSA).
In the letter, the coalition suggests the DSA as the ideal legislative vehicle to contain a ban on surveillance-based ads. Negotiations to shape a final proposal, which E.U. institutions will need to vote on, remain ongoing — but it’s possible the E.U. parliament could pick up the baton to push for a ban on surveillance ads. It has the power to amend the Commission’s legislative proposals, and its approval is needed to adopt draft laws. So there’s plenty still to play for. “In the U.S., we urge legislators to enact comprehensive privacy legislation,” the coalition adds. It is backing up its call for a ban on surveillance-based advertising with another report (also by the NCC), which lays out the case against microtargeting — summarizing the raft of concerns that have come to be attached to manipulative ads as awareness of the adtech industry’s vast, background people-profiling and data trading has grown.
Listed concerns not only focus on how privacy-stripping practices are horrible for individual consumers (enabling the manipulation, discrimination, and exploitation of individuals and vulnerable groups) but also flag the damage to digital competition as a result of adtech platforms and data brokers intermediating and cannibalizing publishers’ revenues — eroding, for example, the ability of professional journalism to sustain itself and creating the conditions where ad fraud has been able to flourish. Another contention is that the overall health of democratic societies is put at risk by surveillance-based advertising — as the apparatus and incentives fuel the amplification of misinformation and create security risks, even national security risks. (Strong and independent journalism is also a core plank of a healthy democracy.) “This harms consumers and businesses and can undermine the cornerstones of democracy,” the coalition warns.
“Although we recognize that advertising is an important source of revenue for content creators and publishers online, this does not justify the massive commercial surveillance systems set up in attempts to ‘show the right ad to the right people’,” the letter goes on. “Other forms of advertising technologies exist, which do not depend on spying on consumers, and cases have shown that such alternative models can be implemented without significantly affecting revenue. There is no fair trade-off in the current surveillance-based advertising system. We encourage you to take a stand and consider a ban on surveillance-based advertising as part of the Digital Services Act in the E.U. and for the U.S. to enact a long overdue federal privacy law.”
The letter is just the latest salvo against ‘toxic tech’. And advertising giants like Facebook and Google have — for several years — seen the pro-privacy writing on the wall. Hence, Facebook’s claimed ‘pivot to privacy’, its plan to lock in its first-party data advantage (by merging the infrastructure of different messaging products), and its keen interest in crypto. It’s also why Google has been working on a stack of alternative adtech to replace third-party tracking cookies. However, its proposed replacement — the so-called ‘Privacy Sandbox’ — would still enable groups of Internet users to be opaquely clustered by its algorithms in ‘interest’ buckets for ad targeting purposes, which still doesn’t look great for Internet users’ rights. (And concerns have been raised on the competition front too.)
Where its ‘Sandbox’ proposal is concerned, Google may well be factoring in the possibility of legislation that outlaws — or, at least, more tightly controls — microtargeting. It’s therefore trying to race ahead with developing alternative adtech that would have much the same targeting potency (maintaining its market power) but could potentially sidestep a ban on ‘microtargeting’ on a technicality, by swapping out individuals for cohorts of web users. Legislators addressing this issue will therefore need to be smart in drafting laws to tackle the damage caused by surveillance-based advertising. Certainly, they will if they want to prevent the same old small- and large-scale manipulation abuses from being perpetuated.
The NCC’s report points to what it dubs “good alternatives” for digital advertising models that don’t depend on the systematic surveillance of consumers to function. These alternatives, it also argues, provide advertisers and publishers with “more oversight and control over where ads are displayed and which ads are being shown”. The problem of ad fraud is certainly massively underreported. But, well, it’s instructive to recall how often Facebook has had to ‘fess up to issues with self-reported ad metrics. “Selling advertising space without basing it on intimate consumer details is possible. Solutions already exist to show ads in relevant contexts, or where consumers self-report what ads they want to see,” the NCC’s director of digital policy, Finn Myrstad, noted in a statement. “A ban on surveillance-based advertising would also pave the way for a more transparent advertising marketplace, diminishing the need to share large parts of ad revenue with third parties such as data brokers. A level playing field would give advertisers and content providers more control and keep a larger revenue share.”