IAB sets webinar for public comment on privacy, identity for ad targeting; publishers weigh reactions; global opt-out affirmed for Calif.

Privacy Beat

Your weekly privacy news update.




Comments due on IAB Tech Lab specs for ad-industry ID system; publishers weigh involvement; who governs? March 24 webinar to consider

The public faces an initial April 8 deadline to comment on draft rules on identity and privacy that the ad-tech-dominated IAB Tech Lab released last week. Upcoming on Wednesday (March 24) is an “Addressability Solutions Road Show” — a public webinar — to explain a set of preliminary working documents and ideas designed to define and thread a needle between consumer privacy and targeted advertising.

The documents are circulating as U.S. publishers try to figure out whether they will get on board an advertising- and ad-tech-industry initiative to coalesce around an email-based ID system to replace third-party cookies. The system, Unified ID 2.0, is to be operated by trade association Prebid.org.

Google has said it will not support Unified ID (see link below). Also unresolved, apparently, is who will set policy and governance for such a system. A key issue is said to be who will have authority to discipline or even cut off companies that stray from privacy and system rules.

“Predictable user privacy — across platforms and devices — can only be achieved through open standards where all stakeholders have a say at the table,” said Jordan Mitchell, SVP and head of consumer privacy, identity and data for IAB. It must provide transparency and control to users and be able to demonstrate compliance, he added in a statement. 

Four downloadable documents comprise the public material developed by IAB in what it terms its “Standards For Responsible Addressability And Predictable Privacy.” They cover such things as auditable data structures, formats for exchanging data rights and preferences, the use of email-based identities and the handling of ad targeting when no user ID is present. The Tech Lab documents say its mission is to “streamline technical privacy standards into a singular schema and set of tools which can adapt to regulatory and commercial market demands across channels.”

Accounts of the IAB privacy-data effort say nearly 300 people from ad-tech firms, ad agencies, media outlets and other identity-tech providers worked on the IAB document-drafting effort, assisted by LiveRamp Holdings Inc., a company that acquires and works with consumer email addresses supplied by publisher clients in the programmatic advertising system. “We look at this as establishing the baseline,” Travis Clinger, SVP and head of addressability and ecosystem at LiveRamp, told Digiday’s Kate Kaye in her March 11 account, entitled “Gaps remain in industry guidelines for controversial email-based identity tech.”



Does your organization need customized privacy compliance solutions? ITEGA  can help.

We bring together support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More



Mobile-phone users found “very to extremely” concerned re data use; but might agree to tracking for content

A majority of 1,500 smartphone users surveyed last month would allow some form of tracking rather than pay a subscription fee, but some 45% prefer not to allow tracking, even though they realize they will see less-relevant advertising. Less than one-third are aware of the Apple-led privacy changes coming.

The findings are part of a survey funded and released by San Francisco-based mobile-ad measurement company AppsFlyer Inc., in collaboration with the non-profit Mobile Marketing Association.  The downloadable report is entitled, “Personal Data, Privacy & Smartphones: The Cautious Consumer.” 

Overall, the survey found smartphone users “very to extremely concerned about the use of personal online data by companies.” The concern spanned all ages, with 41% of those 18-24 “moderately” concerned and 38% of those 65 and older “extremely” concerned.

Many took action on their concern. The survey found 47% use an ad blocker and 35% use a browser extension eliminating ads — with percentages highest among youth and young adults. The highest concern (58%) was around identity theft. Some 45% wanted to receive some share of the revenue earned from their data.  Some 34% of respondents considered social-media data such as personal interests to be “sensitive” data, and of more concern than location data.

The survey asked questions about paying for information, perceptions of the purpose of data collection and knowledge of tracking or coming privacy regulations. Among conclusions was this: “Smartphone users want tech companies to step up, be transparent and educate them in ways that are simple and direct. ‘If you want my data, what do I get?’ is their mantra.”







Above: Optional “icon” specified for CCPA opt-out

California confirms enforcement of Global Privacy Control signal and names five-member privacy agency board

On schedule, California’s political leadership appointed the five members of a new California Privacy Protection Agency governing board. It will have authority to appoint an executive director and spend at least $10 million a year to promote, interpret and enforce the state’s pioneering online privacy laws. Its members appear to have broad legal and advocacy experience around privacy.

Days before Xavier Becerra’s confirmation as U.S. Health & Human Services secretary, the outgoing California attorney general’s office also released a final set of regulations interpreting the California Consumer Privacy Act (CCPA) in several significant ways affecting enforcement.

Significantly, the new regulations affirmed and amplified a tweet sent by the office in February: the regulatory amendments confirm it is mandatory for companies collecting consumer data online to honor requests from “authorized agents” of the public to opt out of data collection or selling. This makes it potentially feasible for consumers to take a “one-and-done” approach to specifying their privacy preference with the Global Privacy Control signal.

The exact language covers “user-enabled global privacy controls, such as a browser plug-in or privacy setting, device setting, or other mechanism, that communicate or signal the consumer’s choice to opt-out of the sale of their personal information.” The regulations say businesses “shall treat” such controls “as a valid request.” The U.S. advertising industry had asked that the law not require respecting GPC-submitted requests.
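For readers on the implementation side, a minimal sketch of what honoring such a signal looks like: the public Global Privacy Control proposal specifies that participating browsers send a `Sec-GPC: 1` request header (and expose a `navigator.globalPrivacyControl` property to page scripts). The header name and value below follow that specification; the function name and shape are illustrative assumptions, not part of the regulations.

```typescript
// Sketch: server-side detection of the Global Privacy Control signal.
// Per the GPC proposal, a browser with the control enabled sends the
// request header `Sec-GPC: 1`. Header names are case-insensitive; this
// assumes they arrive lower-cased (as in Node.js request objects).
function gpcOptOutRequested(
  headers: Record<string, string | undefined>
): boolean {
  // Only the exact value "1" counts as a valid opt-out signal.
  return headers["sec-gpc"] === "1";
}

// Example: a request from a GPC-enabled browser signals an opt-out,
// while a request with no header (or any other value) does not.
const optedOut = gpcOptOutRequested({ "sec-gpc": "1" }); // true
const noSignal = gpcOptOutRequested({}); // false
```

Under the regulatory language above, a business receiving that header would be expected to treat the request as a valid “Do Not Sell” opt-out.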

The new regulatory language released on Monday also:

  • Adopted and publicized a new “icon” (shown above) that websites may use adjacent to a “Do Not Sell My Personal Information” opt-out statement.  Significantly, the rulemaking says use of the icon is optional. Some privacy advocates expressed concern that the icon chosen might not be ideal.  It’s likely this will be the subject of future rulemaking.
  • Confirmed that collecting “personal information” in an offline setting — such as at a store during checkout — has to be accompanied by a sign, oral notice or a paper handout if the data is to be sold.
  • Defined and expanded rules forbidding the use of so-called “dark patterns” — multiple, confusing steps to reach the point where the “Do Not Sell” option can be exercised.  Getting there must be easy, require minimal steps, and not have the “purpose or substantial effect” of impairing the right to opt out, the attorney general’s office ruled.

Under CCPA the operative regulatory word is “sold.”  When the newer California Privacy Rights Act (CPRA) goes into effect next year, intended data “sharing” will also require notice and can be forbidden by a consumer through a blanket or one-off check-off.



Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat




In Wired, Gilad Edelman summarizes FLoC and its trade-offs, including for journalism; is regulation an answer?

“Google got some good press a few weeks ago when it announced in a blog post that it would be moving forward with its plans to remove third-party cookies from the Chrome browser. …. The most prominent proposal is something called Federated Learning of Cohorts, or FLoC . . . Under this proposal, instead of letting anyone track you from site to site, Chrome will do the tracking itself. Then it will sort you into a small group, or cohort, of similar users based on common interests. When you visit a new website, in theory, advertisers won’t see you, Jane C. Doe; they’ll just see whatever cohort you belong to, say, thirtysomething unmarried white women who have an interest in Bluetooth headphones . . . 

“Google’s planned changes address what I have called the Peeping Tom theory of privacy, which boils the concept down to the right to not have random strangers snooping on you. This is a totally inadequate definition, because it overlooks the collective dimension of digital privacy. Even if you, personally, avoid being tracked, you still live with the consequences of an economy built on monitoring people’s behavior and using it to target them with ads. The dominant model of advertising undermines quality journalism—an important pillar of democratic societies—by allowing the information logged about a reader to be used to target them more cheaply when they go elsewhere, subsidizing low-value and even fraudulent media. It also helps scammers and liars reach users across the web with little oversight, since everything is automated. It makes discriminatory practices extremely difficult to prevent or even detect, since discriminating between users based on their identities is built into the basic premise of microtargeting . . . The [Google-proposed] Privacy Sandbox fixes none of these issues. 

“Meanwhile, the non-Google ad tech industry is busy devising alternative techniques to track users that don’t rely on third-party cookies—including potentially more invasive methods like tying users to their email addresses rather than the current version of advertising IDs, which at least can easily be reset. “This doesn’t deal with the underlying problem of surveillance capitalism,” said Ashkan Soltani, a privacy researcher and former chief technologist at the Federal Trade Commission . . . 

“What would it look like for digital advertising to change in ways that took more of these issues into account? Alternate models exist. Contextual advertising allows ads to be targeted based on the content that a user is reading, listening to, or watching, without knowing anything about the user themselves . . . Google has said that it will support contextual and first-party targeting models, which could be promising. But, again, the devil will be in the details . . . 

“What’s dangerous is treating the end of third-party cookies as privacy itself, rather than an incremental shift that comes with its own set of trade-offs. This may be a familiar refrain at this point, but ultimately it’s going to be up to the government, not self-interested ad tech companies, to implement a regulatory framework that tackles the broad, collective dimensions of the digital privacy problem. Letting only Google know my secrets might be better than exposing myself to the whole ad tech industry, but not by a whole lot.”


Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker.  Submit links and ideas for coverage to newsletter@itega.org.





Copyright © 2021 Information Trust Exchange Governing Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp