IAB ranks privacy challenge as a priority as Tech Lab seeks role in UID 2.0 identity “administration”; Schrems raises heat on cookies

Privacy Beat

Your weekly privacy news update.


Graphic courtesy The Trade Desk/Getty Images

IAB ranks privacy challenge among policy priorities; Tech Lab seeks UID 2.0 identity “administration” role

The Interactive Advertising Bureau (IAB), in an online survey of its members introduced by its president, David Cohen, and distributed by email this week, appears to rank privacy among its top priorities for 2021 and 2022.  The survey comes as the IAB Tech Lab seeks to find a role for itself in governing web identity (and hence privacy).

A listing of eight challenges/topic areas for the 650-member IAB puts first: “Privacy Now” (privacy compliance, public-policy advocacy and technical solutions concerning recently enacted and potential new privacy laws).  Other topics, in descending order, are The New Media Consumer, Tele://Vision, Diversity, Equity and Inclusion, Brand Disruption, News Saves Lives (supporting news and balancing brand safety and civic good through research and marketing efforts), The Future of Addressability and The Measurement Imperative.

In the forwarding email, Susan Hogan, IAB’s svp, research and analytics, said both IAB members and “active non-members” had received the invitation to complete the 15-minute survey, intended to learn about constituent priorities and gather feedback on IAB’s performance.

IAB is weighing — through the survey — when members will be ready to start attending face-to-face events as COVID restrictions ease. Meanwhile, its next virtual event is set for June 16: “State of Data Town Hall: Future of Contextual.”

Last week, IAB Tech Lab and The Trade Desk (TTD) announced simultaneously that TTD had contributed code, ongoing maintenance and development for Unified ID 2.0 (UID 2.0) to the Tech Lab, to be made available under the BSD-2 license, a form of “open source.”  The code base for the federated identity service will presumably be available publicly.

Meanwhile, the Partnership for Responsible Addressable Media (PRAM), an ad-industry trade group, was quoted in The Trade Desk release as “intending to quickly finalize its technical and policy standards for Addressable Media Identifiers” and evaluate UID 2.0 for potential endorsement.

Not yet clear is what entity will have authority to sanction entities and remove them from the use of UID 2.0, and for what reasons.  Roles for “Administrator” and “Auditor” are not set yet; the IAB Tech Lab’s release, written over the name and photo of Alex Cone, VP, privacy and data protection, said the lab is “specifically focused on Tech Lab’s potential role as Administrator.”  Tech Lab has been working on an accountability platform.

AD TECH

PUBLISHING, PLATFORMS & PRIVACY

ANTITRUST

 

WASHINGTON WATCH

Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More

 

CafeMedia studies Google FLoC — cohort-based privacy in the browser; UDEX compared

A former editor and Mozilla open-tech expert who now works for an ad-tech company is offering the first public look at the operation of “FLoC” — Google’s effort to control ad targeting inside its Chrome browser rather than with third-party cookies. Don Marti works for CafeMedia, which represents publishers selling their digital advertising. He’s also an advisor to the Information Trust Exchange Governing Association, sponsor of this newsletter.

“At this early stage, FLoC isn’t yet being used by advertisers to target and buy ads,” Marti writes in a post dated May 27 on the CafeMedia website. “However, we can explore how FLoC cohorts line up to people’s actual content consumption interests, based on traffic trends across the 3,000+ sites that CafeMedia works with.”  In his blog post, Marti describes interest patterns evident in the “Origin Trial” of FLoC which Google has enabled in recent downloads of Chrome.

It takes some explaining to understand FLoC; Marti, as a test user, walked Privacy Beat through it in these steps (a rough code sketch follows the list):

  1. If you browse the web with Chrome on your phone or other device, Chrome logs all the pages you visit.

  2. The new FLoC JavaScript code lodged in the Chrome browser — which Google has made open source — analyzes just the domain names (not individual pages) and turns that history into a super long number/symbol sequence called a “CityHash,” says Marti.

  3. Then, the FLoC code on the browser creates a new long string called a “SimHash” (the term is discussed in this Google white paper).

  4. Once a week, the browser-based FLoC software contacts a Google server, sends it the unique SimHash and the Google server compares it to millions of other SimHashes it has received across the web.

  5. The Google server sends back to the Chrome browser a cohort number (one of thousands) that best associates the submitted SimHash with like-interest SimHashes globally — say 17001. The Washington Post’s Aram Zucker-Scharff is among those raising concerns about sensitive cohort groupings.

  6. Now, when an advertising network seeks to place an ad in front of that browser’s user, it requests from the browser its latest cohort group ID (the 17001), deciding whether the user of the browser is an attractive-enough target to bid on the ad position.
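To make those steps concrete, here is a minimal sketch in Python of the flow Marti describes. It is not Chrome’s actual implementation: the CityHash step is approximated with Python’s built-in hashlib, the cohort “server” is simulated in-process, and the cohort count of 4,096 is an assumption. It only illustrates the shape of the pipeline: domains to hashes, hashes to a SimHash fingerprint, fingerprint to a cohort ID.

import hashlib

HASH_BITS = 64  # SimHash fingerprints here are 64-bit, an assumption

def domain_hash(domain):
    # Stand-in for CityHash64: a stable 64-bit hash of a domain name.
    digest = hashlib.sha256(domain.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big")

def simhash(domains):
    # Classic SimHash: each input hash votes +1/-1 per bit position;
    # the sign of the total decides the output bit, so similar
    # histories yield fingerprints with small Hamming distance.
    counts = [0] * HASH_BITS
    for d in domains:
        h = domain_hash(d)
        for bit in range(HASH_BITS):
            counts[bit] += 1 if (h >> bit) & 1 else -1
    fingerprint = 0
    for bit in range(HASH_BITS):
        if counts[bit] > 0:
            fingerprint |= 1 << bit
    return fingerprint

def assign_cohort(fingerprint, num_cohorts=4096):
    # Toy stand-in for the weekly server step: bucket a fingerprint
    # into one of a few thousand cohorts by its leading bits. (Google's
    # real server clusters millions of SimHashes; this only mimics the
    # interface of "fingerprint in, cohort ID out.")
    prefix_bits = num_cohorts.bit_length() - 1
    return fingerprint >> (HASH_BITS - prefix_bits)

history = ["news.example", "recipes.example", "gardening.example"]
print("cohort ID:", assign_cohort(simhash(history)))

Two users with mostly overlapping browsing histories would land on nearby fingerprints, which is what lets the server group them into the same interest cohort without ever seeing the underlying URLs.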

From an engineering point of view the approach is elegant. But why does a user’s anonymized cohort assignment have to be inferred and managed by a Google-originated browser and a Google server?

One alternative approach is suggested by a draft technical specification developed in the open for ITEGA three years ago and known as “UDEX.ORG.”  In the UDEX model, the end user’s home base provides information about the user’s interests — either inferred or expressed — to a server system, with the user’s ID obfuscated by hashing. The interest information is used by the UDEX service to assign the anonymous user, potentially, to multiple interest cohorts. An advertiser (see chart) can then make a decision to serve an ad by asking UDEX for the anonymous user’s interest-cohort profile. Rather than updating weekly, the UDEX profiles could be continuously updated based on new information from the user’s identity service provider — such as a publisher.  (See related: “An advertising awards show in the browser.”)
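As a thought experiment, here is a hedged sketch of that flow, again in Python. The class and method names (UdexService, update_interests, cohort_profile) are invented for illustration and are not from the UDEX.ORG draft; the sketch also simplifies cohort assignment to a direct mapping from interest labels.

import hashlib
from collections import defaultdict

def obfuscate(user_id, salt="publisher-secret"):
    # One-way hash performed on the identity provider's side, so the
    # UDEX service never sees the raw user identity.
    return hashlib.sha256((salt + user_id).encode("utf-8")).hexdigest()

class UdexService:
    def __init__(self):
        # hashed user ID -> set of interest-cohort labels
        self.profiles = defaultdict(set)

    def update_interests(self, hashed_id, interests):
        # Called by the user's identity service provider (for example,
        # a publisher) whenever new inferred or expressed interests
        # arrive, so profiles update continuously rather than weekly.
        self.profiles[hashed_id].update(interests)

    def cohort_profile(self, hashed_id):
        # Advertiser-facing lookup: returns cohorts only, never an ID.
        return self.profiles.get(hashed_id, set())

udex = UdexService()
hashed = obfuscate("reader-42")                      # publisher side
udex.update_interests(hashed, {"gardening", "news"}) # publisher pushes
print(udex.cohort_profile(hashed))                   # advertiser asks

One design difference the paragraph highlights falls out naturally here: because the identity provider controls when update_interests is called, profile freshness is the provider’s policy decision, not a fixed weekly cycle imposed by the browser vendor.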

BROWSER / PLATFORM PRIVACY

PERSONAL PRIVACY

 

Schrems in video, above. (See QUOTE OF THE WEEK, below.)

Schrems turns up heat on personal-data users over cookie consent; is a reckoning coming in the EU and California?

An Austria-based lawyer and privacy evangelist is turning up the heat on both regulators and the users of personal data on the web with his own data-driven fight over cookies.

Max Schrems reported this week that his NOYB.EU (“none of your business”) advocacy group has sent 500 “draft” complaints to websites which he says are deliberately confusing consumers so that they won’t “opt-out” of the use of their data for cross-site tracking and other purposes.

“Everybody in Europe hates cookie banners,” Schrems was quoted by DW.com as saying.  “Many people feel that cookie banners are where the GDPR went horribly wrong, but the problem is not the GDPR, it’s crazy deceptive designs. . . . Companies use every trick in the book to make you hit this accept button.”

Schrems, in a lively video, says NOYB is using technology to crawl thousands of European Union-serving websites and investigate how they present a required cookie “opt-out” choice. If sites don’t clean up their act within 30 days, he says, the draft complaints will be filed with General Data Protection Regulation (GDPR) enforcers capable of imposing billions in fines.
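NOYB has not published its scanning software, but the kind of first-layer check it describes could look something like the sketch below. Everything here is an assumption for illustration: the URL, the wording lists and the heuristic are invented, and real cookie banners are usually rendered by JavaScript, so a production crawler would need a headless browser rather than a plain HTTP fetch.

import requests

ACCEPT_WORDS = ("accept", "agree", "allow all")   # assumed wording lists
REJECT_WORDS = ("reject", "decline", "refuse")

def first_layer_check(url):
    # Fetch a page and report whether accept/reject wording appears in
    # the initially served HTML, a crude stand-in for banner parsing.
    html = requests.get(url, timeout=10).text.lower()
    return {
        "url": url,
        "has_accept": any(w in html for w in ACCEPT_WORDS),
        "has_first_layer_reject": any(w in html for w in REJECT_WORDS),
    }

result = first_layer_check("https://example.com")
if result["has_accept"] and not result["has_first_layer_reject"]:
    print("candidate for a draft complaint:", result["url"])

A banner that offers “accept” on its first layer but hides “reject” in a sub-page is exactly the pattern the BBC figures below describe.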

“Over the course of a year,  noyb will use this system to ensure compliance of up to 10,000 of the most visited websites in Europe,” NOYB says on its website. “If successful, users should see simple and clear “yes or no” options on more and more websites in the upcoming months.”

Of the 500 pages in its first batch of complaints, the BBC reported, 81% had no “reject” option on the first page; it was instead hidden in a sub-page. Another 73% used “deceptive colours and contrasts” to lead users into clicking “accept,” and 90% provided no easy way to withdraw consent, the BBC said.

Thus in both California and the EU, privacy advocates are pushing regulators to enforce laws. Data users, including advertisers, are waiting to see how tough any crackdown will be and whether they can litigate their way to reduced fines. Key provisions of the California Privacy Rights Act don’t take effect until Jan. 1, 2023, but groups are already seeking enforcement of its precursor, the CCPA.

EU PRIVACY

STATEHOUSE BEAT — Lobbyist pressure

WORLD PRIVACY

Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

ANA delays deadline in search for vendor to author a report digging into efficacy of programmatic advertising

The Association of National Advertisers (ANA) is still looking for someone to write a tough-talking report on the effectiveness and trustworthiness of programmatic advertising.  The ANA said this week it had delayed for more than a month — to June 25 — the deadline for receiving proposals to review.  “No vendor has been selected yet,” said ANA spokesman John Wolfe.

In its April request for proposals (RFP), the ANA had set a May 14 deadline, and said the assumption that programmatic advertising costs are justified by precise targeting of audiences “needs to be challenged.”  The RFP states: “We believe there is substantial waste for advertisers within programmatic advertising. What we do not know is exactly how much waste there is nor the true sources of it.”

Added Bob Liodice, ANA CEO, in an April news release: “We believe this lack of transparency is costing advertisers billions of dollars in waste.”  Moreover, the RFP text notes, the lack of transparency also means the majority of “spend” gets used up before it hits publishers’ coffers.

PRIVACY BUSINESS

UPCOMING EVENTS

QUOTE OF THE WEEK

noyb cites “cookie banner terror”; issues more than 500 GDPR-style complaints

  • The following is an excerpt from the NOYB.EU website regarding its effort, announced May 31, to change the way European Union-focused websites present consumers with the option to “opt-out” of having data about them collected.

“Today, noyb.eu sent over 500 draft complaints to companies who use unlawful cookie banners – making it the largest wave of complaints since the GDPR came into force.

“By law, users must be given a clear yes/no option. As most banners do not comply with the requirements of the GDPR, noyb developed a software that recognizes various types of unlawful cookie banners and automatically generates complaints. 

“Nevertheless, noyb will give companies a one-month grace period to comply with EU laws before filing the formal complaint . . . Companies are served with an informal draft complaint via email and even get a step-by-step guide (PDF) on how to change software settings to comply with the law. If companies choose not to change their settings within a month, noyb will however file a complaint with the relevant authority, which may issue a fine of up to € 20 Million. . . . Over the course of a year,  noyb will use this system to ensure compliance of up to 10,000 of the most visited websites in Europe. If successful, users should see simple and clear “yes or no” options on more and more websites in the upcoming months . . . .

“A whole industry of consultants and designers develop crazy click labyrinths to ensure imaginary consent rates. Frustrating people into clicking ‘okay’ is a clear violation of the GDPR’s principles. Under the law, companies must facilitate users to express their choice and design systems fairly. Companies openly admit that only 3% of all users actually want to accept cookies, but more than 90% can be nudged into clicking the ‘agree’ button . . . .

“Some companies are clearly trying everything to make privacy a hassle for users, when they have a duty to make it as simple as possible. Almost all situations in which users are confronted with data protection are designed by companies. They often deliberately make the designs of privacy settings a nightmare, but at the same time blame the GDPR for it. This narrative is repeated on hundreds of pages, so users start to think that these crazy banners are required by law . . . . “

“The GDPR was meant to ensure that users have full control over their data, but being online has become a frustrating experience for people all over Europe. Annoying cookie banners appear at every corner of the web, often making it extremely complicated to click anything but the “accept” button. Companies use so-called “dark patterns” to get more than 90% of users to “agree” when industry statistics show that only 3% of users actually want to agree . . . .”

ABOUT PRIVACY BEAT

Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker.  Submit links and ideas for coverage to newsletter@itega.org.



Copyright © 2021 Information Trust Exchange Governing Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.
