
Privacy Beat

Your weekly privacy news update.


Could web browsers create privacy-protecting cohorts of users for serving advertising? UDEX as alternative

An ad-tech follower with many years of experience as a consulting corporate privacy lawyer is suggesting that the makers of web-browser software may impose a new approach to targeted advertising based upon “cohorts” big enough, in theory, to assure anonymity for users.

Alan Chapell’s insight is contained in an April 20 column he wrote on the AdExchanger website called, “Planning for the Future While Your Hair Is on Fire.” Chapell heads Chapell & Associates, in Brooklyn, N.Y., and Sausalito, Calif.

Chapell says the browser makers are discussing such ideas in a preliminary way in a World Wide Web Consortium (W3C) working group. He goes on to write:

“The browsers will create cohorts of users that are large enough so that it would be nearly impossible for anyone to identify any of the users in the cohort. My back-of-the-envelope guesstimate is that we’ll see cohorts of 500 to 2,000 users depending on the campaign. The basic idea is that ads get served into each cohort as a group, and that no entity within the ad chain will be able to break up a cohort. That may create some challenges for attribution (among other things), but let’s leave that to the side for now.”
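
As a rough illustration of the mechanism Chapell describes, here is a minimal sketch (in Python) of how a browser might group users into interest-based cohorts and expose a cohort label only when the group clears a minimum size. It is not any browser vendor’s actual design; the interest hashing, the 500-user threshold and the fallback “generic” cohort are assumptions made for the example.

    import hashlib
    from collections import defaultdict

    MIN_COHORT_SIZE = 500          # lower end of Chapell's 500-to-2,000 guesstimate
    GENERIC_COHORT = "cohort:generic"

    def cohort_id(interests):
        """Derive a stable cohort label from a user's set of interest topics."""
        digest = hashlib.sha256("|".join(sorted(interests)).encode()).hexdigest()
        return "cohort:" + digest[:8]

    def assign_cohorts(users):
        """Map each user ID to a cohort label, hiding cohorts below the size threshold."""
        members = defaultdict(list)
        for user, interests in users.items():
            members[cohort_id(interests)].append(user)

        assignments = {}
        for cid, group in members.items():
            # Cohorts too small to guarantee anonymity collapse into one generic
            # bucket, so nothing downstream can single out their members.
            label = cid if len(group) >= MIN_COHORT_SIZE else GENERIC_COHORT
            for user in group:
                assignments[user] = label
        return assignments

In such a scheme an ad campaign would target the cohort label rather than any individual user, which is also why, as Chapell notes, attribution becomes harder.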

Chapell writes that this approach risks a lack of transparency in how the cohorts are selected and managed, raising the question of whether the browser makers (primarily Google, Mozilla, Microsoft and Apple) “just become another set of walled gardens where everybody just needs to trust them?” 

The nonprofit World Wide Web Consortium (W3C) manages technical discussions on proposed web standards. One proposal pending in the W3C’s “Improving Web Advertising Business Group” is called “restricting ad delivery by audience.” Another independent proposal submitted to the W3C is called “Federated Learning of Cohorts” (FLoC). It was first proposed in August but little discussion has occurred around it yet. The W3C also hosts a Privacy Community Group where browser makers have been among recent discussion participants.

(The nonprofit Information Trust Exchange Governing Association, sponsor of this Privacy Beat blog, has proposed the creation of an anonymizing user data exchange (UDEX.ORG) that would be governed in the public interest and with transparent algorithms.)

Google created what it calls a “Privacy Sandbox” months ago to consider approaches to privacy if the use of third-party cookies — a mainstay of the current real-time-bidding advertising ecosystem — comes gradually to an end.  Read Sam Dutton’s Digging into Google’s “Privacy Sandbox” — 3rd party use without 3rd party cookies?

ADVERTISING TECH

Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More

Tracking and tracing privacy benchmarks: Six questions from Britain’s privacy “czar” Elizabeth Denham

Britain’s information commissioner, Elizabeth Denham, has offered six questions that governments and privacy advocates might use to evaluate COVID-19 tracking and tracing initiatives. Denham posted the six points in an April blog post.

The six questions, detailed in the post here, are:

  • Have you demonstrated how privacy is built into the processor technology?

  • Is the planned collection and use of personal data necessary and proportionate?

  • What control do users have over their data?

  • How much data needs to be gathered and processed centrally?

  • When in operation, what are the governance and accountability processes in your organization for ongoing monitoring and evaluation of data processing?

  • What happens when the processing is no longer necessary?

COVID TRACING — US

COVID TRACING — WORLD


Brave now says GDPR “in danger of failing” for lack of enforcement resources; issues new report

Brave Software Inc. is seeking to keep up public pressure on European Union regulators to adopt its point of view and find that the current real-time-bidding advertising system violates the landmark EU privacy law, the General Data Protection Regulation (GDPR).

But now it is documenting a lack of resources that regulators might need to prove the point.

The latest salvo is a data-laden analysis by Brave’s Johnny Ryan, picked up April 27 by The New York Times, which says that the only major enforcement so far has been a 50-million-euro fine against Google, described in The Times report as “about one-tenth of what Google generates in sales each day.”

Brave lodged complaints with the European Commission naming 27 EU member states as allegedly failing to provide enough staffing and resources to their national data-protection watchdogs. In a statement on Brave’s website, Ryan said the result is that GDPR “is now in danger of failing.” The statement provides five bullet-point examples of the alleged lack of resources. (DOWNLOAD BRAVE’S REPORT)

The issue is important to Brave because its upstart browser software, which is slowly gaining market share, is focused on blocking most advertisements placed by the current real-time-bidding system and substituting a networked partnership among Brave, publishers and advertisers.

GDPR ENFORCEMENT

PERSONAL PRIVACY

BIOMETRICS, AI AND PRIVACY

CCPA WEEK FIFTEEN

As CCPA delete and “do-not-sell” demands ramp up, vendor report says low cost is $140K per million records

Privacy compliance vendor DataGrail Inc. published on Thursday a look at the volume and estimated cost of compliance with the California Consumer Privacy Act (CCPA) for its clients. Its estimate: Companies that manually process data subject requests (DSRs) or access requests can expect to spend between $140K and $275K per million customer records in their systems.

DataGrail’s analysis, and its pitch for its automated systems, are contained in its report, Look Back at Early Trends Shaping the Privacy Landscape, authored by Daniel Barber, CEO of the two-year-old, San Francisco-based company. Three of the things the company highlighted:

  • January 2020 saw a surge of privacy requests, most likely due to the law going into effect and privacy policy updates.

  • Deletion requests were the most popular (40%) in Q1 2020, followed by Do Not Sell (DNS) requests (33%) and access requests (27%).

  • Based on early trending data, DNS requests will likely become the dominant type of privacy request.

The DataGrail report can be downloaded HERE. Its findings are also reported by Ray Schultz at MediaPost/Digital News Daily under the headline: “CCPA Drives Surge In Consumer Privacy Requests In Q1: Study.”

MORE ABOUT CCPA

Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

Can premium content be exchanged for user data under CCPA? Lawyer: “Yes”, if transparent, reasonably valued 

Many aspects of the California Consumer Privacy Act (CCPA) cover new ground when it comes to defining terms and processes, and there is general agreement that this will lead to litigation and perhaps amendments in the months and years ahead.

One area of ambiguity concerns when a data handler can differentiate the price or terms of a service — such as a subscription — based on whether a person is willing to share data about themselves. The question is among those explored in an April 29 blog post, “The CCPA Non-Discrimination Right, Explained,” by four attorneys in the law firm of Kelley Drye & Warren LLP of Washington, D.C.

Privacy Beat gave co-author Alysa Z. Hutnik a call and posed this question: Would it be legal under the CCPA for the publisher of a largely free website to offer premium content to a user in exchange for personal data that would allow a more personalized service, including advertising?

Hutnik, who heads the firm’s privacy practice, says there isn’t a simple answer, but that at the least, state regulations likely to accompany the law sometime after July 1 will expect such publishers to (a) disclose and be transparent about their terms and (b) have the value of the premium content be reasonably related to the value of the data sought from the user.

“There is recognition that they don’t want a law to stop free models,” Hutnik said of the California attorney general’s CCPA enforcers. “But if you were going to do the premium option you need to have some reasonable basis as to the eventual value of the data. It doesn’t have to be an exact science, you don’t have to hire an economist, but they want some transparency. You have a lot of flexibility in how you do it.” 

For more on the subject of price-for-data discrimination in a different hypothetical use case (asking a consumer to pay for the privilege of not seeing ads), see Quote of the Week, below.

PRIVACY BUSINESS

PRIVACY LAW & POLICY

VIRTUAL EVENTS

QUOTE OF THE WEEK

Could CCPA regulations be interpreted to permit charging for an ad-free content service that does not collect any user data?

“If a business offers both a free service, and a premium service that costs $5-per-month (premium), and only allows the premium-service consumers to opt out of the sale of their personal information, the practice is discriminatory under the CCPA unless the premium payment is reasonably related to the value of the consumer’s data to the business. In this example, as one way to demonstrate the required value, a business may determine that the payment for the premium version offsets the revenue provided by placing ads in the free version…businesses should consider (a) which business practices may qualify as a financial incentive (or offering a different level or quality of goods or services) in connection with the collection, retention, or sale of personal information.”

– Excerpt from an April 29, 2020 post on the Ad Law Access blog maintained by the law firm of Kelley Drye & Warren LLP, authored by Alysa Zelter Hutnik, Aaron Burstein, Carmen Hinebaugh and Alex Schneider.

ABOUT PRIVACY BEAT

Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker. Submit links and ideas for coverage to newsletter@itega.org.


Copyright © 2020 Information Trust Exchange Governing Association, All rights reserved.
