Privacy Beat

Your weekly privacy news update.

1. Google drops major privacy / ad-tech plans in blog posts; seeks industry reaction; early thumbs down

Privacy and browser-tech managers at Google Inc. revealed major privacy and ad-tech initiatives in blog posts on Wednesday and Thursday, generally aimed at squelching practices of ad-tech competitors. Google defended the moves as focused on the user experience and on supporting publishers, and said it wanted its proposals and actions to be part of a broad effort to consult with advertisers, publishers, users and others.

Almost immediately, two computer scientists issued a point-by-point takedown of Google’s initiatives, saying it was “disingenuous” and absurd for Google to say that blocking cookies is bad for privacy. Princeton University Profs. Arvind Narayanan and Jonathan Mayer said Google is just protecting its own ad business. Narayanan was a co-author of the Do Not Track standard and Mayer is an attorney.

Google took particular aim at two things:

  • Moves by browser makers Apple and Mozilla to block most third-party cookies by default in new releases of the Safari and Firefox browsers.  Google said the moves will drive greater ad-tech use of undesirable “fingerprinting” technology.

  • Google’s own plan to block so-called “fingerprinting” by ad-tech companies in the Chrome browser, a practice it says is not in the interest of users.

Because Google has direct log-in relationships with billions of web users, the two items, if implemented, will have little effect on the company’s dominant advertising business, but will create headaches for competitors.
In a detailed blog post, “Building a more private web,” Google’s engineering director for the Chrome browser, Justin Schuh, reiterated that the browser software would begin to discriminate in which third-party cookies it allows, rather than blocking them all. Google claims this aligns with publisher interests. He said Google will launch an open-source browser extension that displays information about ads shown to a user and that works across different browsers.

Schuh also outlined a “Privacy Sandbox” effort. “New technologies like Federated Learning show that it’s possible for your browser to avoid revealing that you are a member of a group that likes Beyoncé and sweater vests — until it can be sure that your group contains thousands of other people,” he said. 
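The Federated Learning idea Schuh invokes rests on a k-anonymity-style threshold: the browser withholds an interest label until the group sharing it is large enough to hide any individual. A minimal sketch of that gating logic, with an invented threshold and invented group names (this is not Google’s actual Privacy Sandbox code):

```python
# Sketch of the k-anonymity gate Schuh describes: the browser reports an
# interest group only once that group contains enough other people.
# K_THRESHOLD and the group labels below are illustrative assumptions.

K_THRESHOLD = 1000  # hypothetical minimum group size

def reportable_interests(user_interests, group_sizes, k=K_THRESHOLD):
    """Return only the interest labels whose group size meets the bar."""
    return [label for label in user_interests if group_sizes.get(label, 0) >= k]

group_sizes = {"beyonce_fans": 250_000, "sweater_vests": 40}
print(reportable_interests(["beyonce_fans", "sweater_vests"], group_sizes))
# → ['beyonce_fans']  (the tiny "sweater_vests" group would identify the user)
```

The design choice is the point of the proposal: nothing leaves the browser about a taste shared by too few people to provide cover.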

Chrome has offered a number of preliminary proposals to the web-standards community in areas such as conversion measurement, fraud protection and audience selection, writes Chetna Bindra, senior product manager, user trust and privacy, at Google, in another Aug. 22 blog post. She said Google “is committed to partnering with others to raise the bar for how data is collected and used.”  A day earlier, Bindra said Google was dropping its opposition to joining the IAB’s Transparency and Consent Framework in Europe, once version 2.0 rolls out next year. 

In a nod to publishers, Bindra cited in her blog post company research on the websites of 500 Google Ad Manager publisher clients in which Google removed its programmatic cookies. “For the news publishers in the studied group, traffic for which there was no [programmatic-ad] cookie present yielded an average of 62 percent less revenue than traffic for which there was a cookie present,” Bindra wrote.

In another forum, Google released what it termed “A Proposal: Giving users more transparency, choice and control over how their data is used in digital advertising.”  In the PDF document from Google, which does not cite an author, the company declares: “…[T]he open, ad-supported internet is at risk if digital advertising practices don’t evolve to reflect people’s changing expectations for privacy.”  

Google states three aspirations in the document:

  • As a company, it “should protect users against fingerprinting for purposes of ad tracking.” 

  • Users should be able to see and control what data is being collected, by whom and why, who is responsible for an ad and what caused it to appear. 

  • Users should have a way to easily access information about all companies involved in data collection and digital advertising, including ad platforms, ad-tech providers and data-collection domains. It suggests a “centralized registry” for this purpose. 


So many people have told us this newsletter is valuable.
Please support the continued work of ITEGA to foster a digital marketplace that respects privacy and identity.


2. Apple pressing down on tracking with both Safari and its new Sign In With Apple — warns of “unintended impacts”  — impeding competing Single Sign On (SSO) services?

New browser and login initiatives by Apple appear to be aimed at crushing third-party tracking of users. It’s a bid to make Apple’s Safari browser software more tuned into emerging user-privacy interests. But what could be the “unintended impact” — to use Apple’s own term?  Could competing Single Sign On (SSO) services be affected?

Apple’s moves come in two contexts: tracking prevention in the Safari browser, and the new Sign In With Apple service.

But here’s the challenge — what if a cookie, or a URL-appended parameter, is being used for a purpose that has nothing to do with privacy violation or ad tracking, but rather provides the user a service that the user understands and wants? The answer, say some observers, is that it will depend on how Apple implements the changes.

As for Sign In With Apple: For years, Facebook and Google users have been able to use their login credentials to access other websites — and a somewhat similar service called EduRoam allows academics to log into local wifi on hundreds of campuses. Apple told developers June 6 it would ramp up “Sign In With Apple” this fall with a privacy-enhancing twist — you can make sure third-party websites receive no personal information other than your name, not even your email address.

Buried in the fine print is the requirement that if an application uses any form of third-party sign-in, Apple will require (last paragraph at this link) that the app also offer Sign In With Apple. So — Sign In With Apple is a forced option, a condition of staying in the Apple iOS ecosystem.


3. When it comes to ad fraud and “surveillance marketing,” Augustine Fou is a data guru — but is anyone listening?

The way digital-advertising technology works today is one of the biggest avenues for opaque collection of user-behavior information. It is also a world where it is hard to tell real users from computer bots.

Augustine Fou spends his days trying to measure that world. With a website and blog, Twitter and LinkedIn, Fou operates Marketing Science Consulting Group Inc. from a Fifth Avenue, New York address. He advises dozens of clients in the advertising, brand and publishing worlds about what he sees as pervasive “fraud” — billing for advertising views seen by machines.

Take this week, for example. On Thursday he posted to his LinkedIn page one more chart that he says shows a spike in website traffic at the end of the month — when people are looking to boost numbers. Is that people or bots? “Any sites or apps that exhibit this phenomenon should be examined for fraud,” he writes. “Because how did they get so many humans to generate so many more impressions in the last 6 hours of the last day of the month?”

Fou dubs himself the “Ad Fraud Historian.” Because his 20-year career spans work on the client, advertiser and agency (Omnicom, McCann/MRM) sides, Fou’s persistent hammering on those players to fix fraud comes from a place of knowledge. He holds a Ph.D. in materials science and engineering from MIT and teaches as an adjunct at Rutgers and NYU.

But is anyone really listening? He asks in a headline in another recent post: “Do you buy the industry narrative about mobile ad fraud being almost non-existent?”

“ . . . I’ve been tracking what the industry trade associations have been saying about ad fraud — that fraud is low and going lower and they’re ‘winning’ — the ‘industry narrative’ if you will,” he wrote in that post. “But how true is that, and does anyone still buy it?”

Or this, from another recent post:

“By selling media under conditions that do not support high third-party validation, even trustworthy media companies are essentially part of the problem, since the vast sea of good but low-transparency inventory provides cover for the fake inventory sold under the same formats.”

His most recent pièce de résistance is the long-form work posted by the online Linux Journal back in February: “What is ‘Surveillance capitalism?’ And How Did it Hijack the Internet?” In it, he argues that advertising-tech companies have promoted a myth: that behavioral tracking of users as they move about the internet yields data that can make advertising more effective through hypertargeting. But he says all three parties — the consumer, publisher and marketer — lose in the equation:

  • The consumer surrenders her privacy

  • The marketer pays with wasted “ad spend” 

  • The publisher pays because so much “ad spend” goes to new tech middlemen



Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

4. CCPA amendments positioned for up-or-down vote in Senate, then back to Assembly; companies urged to comply with current language

The California Senate is done with hearings on proposed changes to the California Consumer Privacy Act before it takes effect Jan. 1. The Senate Appropriations Committee sent seven proposed changes to the Senate floor. If passed, they would go back to the Assembly for further action, which must be completed by Sept. 13.

The posture of the amendments is leading law firms to advise websites that collect consumer data not to delay efforts to update their practices in conformance with the law, even though there are many uncertainties about how its language will be interpreted by the public and by California Atty. Gen. Xavier Becerra.

One challenge companies face is how to pull together consumer data collected in many contexts into a single integrated system that can reveal what they hold to a user on request, as the law will require. Companies need to start working with IT teams now to update data-storage systems and test them before consumer data requests start coming in, one attorney said.
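The consolidation task the attorneys describe can be pictured as a single lookup that fans out across every internal data store. This is a hypothetical sketch — the system names, field names, and matching-by-email approach are all invented for illustration, not taken from any CCPA guidance:

```python
# Hypothetical sketch of a consolidated CCPA access-request lookup:
# given one consumer identifier, collect matching records from every
# internal system into a single response. All names here are invented.

def access_request(email, systems):
    """Collect every record matching the consumer across all data stores."""
    report = {}
    for name, records in systems.items():
        matches = [r for r in records if r.get("email") == email]
        if matches:
            report[name] = matches
    return report

# Toy stand-ins for the "many contexts" in which data was collected.
systems = {
    "crm": [{"email": "a@example.com", "name": "Alice"}],
    "ad_logs": [{"email": "a@example.com", "segment": "autos"}],
    "support": [{"email": "b@example.com", "ticket": 42}],
}
print(access_request("a@example.com", systems))
```

In practice the hard part is exactly what the sketch glosses over: real systems rarely share one clean identifier, which is why the attorneys urge starting the integration and testing work now.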


5. TCF 2.0 released this week

On Tuesday the IAB Tech Lab and IAB Europe released the second version of the Transparency & Consent Framework (TCF) for conducting targeted advertising in compliance with GDPR. Shaped by stakeholder input, version 2.0 is a significant overhaul that gives publishers more options for collecting consent. The update also includes new registration requirements making companies disclose the specific kinds of data they collect and that data’s role in ad campaigns. Having addressed pushback on the first version, IAB Europe and the IAB Tech Lab expect the latest version to gain more traction, with increased adoption of the framework. Google has indicated it will begin gradually implementing TCF 2.0 by the end of Q1 2020.


6. Can data privacy laws fight fake news?

In a recent article for Just Security, based at the Reiss Center on Law and Security at New York University School of Law, Alex Campbell outlines a case for how data-privacy laws will help fight the spread of disinformation on the internet, rendering fake news “a weapon without a target,” and making privacy matters issues of national security. 

“Absent the detailed data on users’ political beliefs, age, location, and gender that currently guide ads and suggested content, disinformation has a higher chance of being lost in the noise,” says Campbell.

He believes that existing data privacy laws can be improved to better fight disinformation, saying, “A U.S. national data privacy law would do well to include a GDPR-esque opt-in model of consent for data collection, processing, and sharing rather than opt-out or a simple notice of collection.”

Read More:

How Data Privacy Laws Can Fight Fake News (Just Security)


“First, large-scale blocking of cookies undermines people’s privacy by encouraging opaque techniques such as fingerprinting. With fingerprinting, developers have found ways to use tiny bits of information that vary between users, such as what device they have or what fonts they have installed to generate a unique identifier which can then be used to match a user across websites. Unlike cookies, users cannot clear their fingerprint, and therefore cannot control how their information is collected. We think this subverts user choice and is wrong.”

– Justin Schuh | Director, Chrome engineering at Google in Aug. 22 blog post: “Building a more private web.” 
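The mechanism Schuh describes — combining tiny, stable browser attributes into one identifier that, unlike a cookie, cannot be cleared — can be sketched in a few lines. The attribute names and values below are illustrative, not drawn from any real tracker:

```python
# Sketch of the fingerprinting technique Schuh's post describes: hash
# small, stable browser/device attributes (device, fonts, etc.) into a
# single identifier that is the same on every site the user visits.
# Attribute names and values are illustrative assumptions.

import hashlib

def fingerprint(attributes):
    """Derive a stable identifier from browser/device attributes."""
    # Sort keys so the same attributes always produce the same string.
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

attrs = {
    "user_agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_14)",
    "screen": "2560x1440",
    "fonts": "Arial,Helvetica,Menlo",
    "timezone": "America/New_York",
}
print(fingerprint(attrs))  # same attributes -> same ID, on every site
```

This is what makes fingerprinting the “opaque technique” of the quote: no stored token exists for the user to delete — clearing cookies changes nothing, because the identifier is recomputed from the device itself.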








