Privacy Beat

Your weekly privacy news update.

1. French and U.K. data-privacy regulators release guidance on consent and use of personal data

In France and the United Kingdom, data-privacy regulators have released guidance on when those who use personal data must obtain permission to collect it.

In France, the data protection authority, CNIL, is signaling that the way most websites ask users for “consent” to track them with cookies does not pass muster under the General Data Protection Regulation, at least in France. In guidelines issued last week, CNIL gave web operators six months from early next year to come up with a consent mechanism that is “freely given, specific, informed and unambiguous.”

When you arrive at a website that displays a banner saying it uses cookies, and you simply continue browsing without giving specific consent to cookie use, that is not valid consent, the French say. Not only is the site operator that uses cookies and trackers considered a “data controller” under GDPR; so are any parties that have placed third-party cookies on the site. They must independently seek consent from the web visitor.

“The new guidelines confirm that continuing to browse a website after its cookie banner is displayed will no longer be considered to be valid consent for cookie use in France,” according to analysis by Patrice Navarro, a Hogan Lovells (U.S.) LLP attorney who posted on the subject July 22. Operators that use cookies and trackers will have to be able to prove that they have obtained explicit consent from the user. 

Navarro writes that the new guidelines apply to all types of operations involving cookies and trackers on any type of device, including smartphones, computers, connected vehicles and any other object connected to a telecommunications network open to the public. Under GDPR Article 26, Navarro adds, where joint data controllers are involved there must be a contractual arrangement among them setting out how valid consent is obtained. An ad network, for example, would be a joint data controller required to get consent for cookie placement, unless it does nothing with any data collected, in which case it would be considered merely a data processor.
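For site operators, the practical upshot is that non-essential cookies and trackers cannot fire until the visitor takes an affirmative action, and the operator needs a record of that action it can later produce. The TypeScript sketch below shows, in simplified form, one way such gating and record-keeping might look in a page's own script; the function names, storage key and ad-network URL are hypothetical and are not taken from the CNIL guidelines.

    // Illustrative sketch only: gate trackers behind explicit, recordable consent.
    // All names, the storage key and the URL below are hypothetical.
    interface ConsentRecord {
      purposes: string[];    // e.g. ["analytics", "advertising"]
      timestamp: string;     // when the visitor clicked "Accept"
      bannerVersion: string; // which banner wording the visitor actually saw
    }

    function getStoredConsent(): ConsentRecord | null {
      const raw = localStorage.getItem("consentRecord");
      return raw ? (JSON.parse(raw) as ConsentRecord) : null;
    }

    // Called only from an explicit "Accept" click handler, never from scrolling or
    // continued browsing, which the CNIL guidelines say is not valid consent.
    function recordConsent(purposes: string[], bannerVersion: string): void {
      const record: ConsentRecord = {
        purposes,
        timestamp: new Date().toISOString(),
        bannerVersion,
      };
      localStorage.setItem("consentRecord", JSON.stringify(record));
      // A real implementation would also log the record server-side so the
      // operator can later prove that consent was obtained.
    }

    // Third-party tags load only after a matching consent record exists.
    function loadTrackersIfConsented(): void {
      const consent = getStoredConsent();
      if (consent && consent.purposes.includes("advertising")) {
        const tag = document.createElement("script");
        tag.src = "https://ad-network.example/tag.js"; // placeholder URL
        document.head.appendChild(tag);
      }
    }

    document.getElementById("accept-button")?.addEventListener("click", () => {
      recordConsent(["analytics", "advertising"], "banner-v2");
      loadTrackersIfConsented();
    });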

But that’s just in France. GDPR enforcement varies widely by country, according to a July 19 nation-by-nation analysis by Compliance Week reporter Neil Lodge.

For guidance on the distinction between a data controller and a data processor, a good primer is supplied by Jessica Davies in a July 22 post on DigiDay entitled “WTF is a data controller vs. data processor.”

Meanwhile, in the United Kingdom, the Information Commissioner’s Office published draft guidance on how it plans to enforce GDPR and other rules regarding the lawful basis for user-data processing, and the obligations to be transparent and to record processing activities for audit purposes. Interested parties have until Sept. 9 to offer comments. The draft text may be viewed here.

“Data sharing brings many benefits to organizations and individuals, but it needs to be done in compliance with data protection law,” said Steve Wood, ICO’s deputy commissioner for policy, in a note accompanying the draft enforcement guidelines. “Our draft data-sharing code gives practical advice and guidance on how to share data safely and fairly, and we are encouraging organizations to send us their comments before we launch the final code in the autumn.”

As with the French guidance, the ICO release prompted legal observers to encourage data-sharing agreements that set out the purpose for sharing, a lawful basis for doing so, what happens to the data at each stage, and the roles of each party.

So many people have told us this newsletter is valuable.
Please support the continued work of ITEGA to foster a digital marketplace that respects privacy and identity.

Donate

2. British academic researchers cast doubt on the ability of technology to truly “anonymize” user data

Writing in the journal Nature Communications, British and Belgian computer scientists said July 24 they have figured out how to re-identify personal information that had been anonymized to a degree previously thought to be secure, and they have posted their software for doing so publicly.

Anonymization requires more than simply adding noise, sampling datasets or applying other de-identification techniques, the authors say. Their findings, they add, should be a wake-up call for policymakers on the need to tighten the rules for what constitutes truly anonymous data.

“We here propose a generative copula-based method that can accurately estimate the likelihood of a specific person to be correctly re-identified, even in a heavily incomplete dataset,” wrote Luc Rocher, Julien M. Hendrickx and Yves-Alexandre de Montjoye. The researchers are at Imperial College London and the Université catholique de Louvain, in Belgium.

In reporting on the article, The New York Times said it is not the first time that anonymized data has been “shown to be not so anonymous after all,” citing the 2016 identification of individuals in supposedly anonymous DNA databases in Germany.
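A toy calculation, far simpler than the paper’s copula-based model, helps show why such datasets offer little protection: count how many records are already unique on just a few quasi-identifiers such as ZIP code, birth year and gender. The TypeScript sketch below is purely illustrative, and its field names and sample values are invented.

    // Toy illustration (not the paper's copula model): how quickly records in a
    // "de-identified" dataset become unique on a few quasi-identifiers.
    interface PersonRecord {
      zip: string;
      birthYear: number;
      gender: string;
    }

    // Fraction of records whose quasi-identifier combination appears exactly once,
    // i.e. records that a single outside fact could pin to one individual.
    function uniquenessRate(records: PersonRecord[]): number {
      const counts = new Map<string, number>();
      for (const r of records) {
        const key = `${r.zip}|${r.birthYear}|${r.gender}`;
        counts.set(key, (counts.get(key) ?? 0) + 1);
      }
      const unique = records.filter(
        (r) => counts.get(`${r.zip}|${r.birthYear}|${r.gender}`) === 1
      ).length;
      return unique / records.length;
    }

    // Invented sample values, for illustration only.
    const sample: PersonRecord[] = [
      { zip: "02139", birthYear: 1985, gender: "F" },
      { zip: "02139", birthYear: 1985, gender: "F" },
      { zip: "02139", birthYear: 1990, gender: "M" },
      { zip: "90210", birthYear: 1971, gender: "F" },
    ];
    console.log(uniquenessRate(sample)); // 0.5: half the records are already unique

Real datasets carry far more columns per person, which is the intuition behind the authors’ finding that even a heavily incomplete dataset can allow correct re-identification.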

3. “Watchdogs” EPIC and CDD admonish senators to avoid closed discussions with tech companies on privacy policies

The Electronic Privacy Information Center and the Center for Digital Democracy are crying foul, in a letter to two senators, over a decision to meet with tech companies about online privacy in a closed-door session of the Senate Judiciary Committee’s Task Force on Privacy. The meeting took place last week among Snap, Mozilla and Sen. Marsha Blackburn, R-Tenn. The letter, dated July 19, was addressed to Blackburn and Sen. Dianne Feinstein, D-Calif.

“We need you to pursue an open and inclusive process that ensures that meetings are held in public, that a record is established, and that the voices of consumers are heard,” the letter said in part. The groups call for federal privacy legislation and an independent data-protection agency, adding: “We can no longer let industry groups and ineffective agencies decide how much privacy Americans may have.”

Other signers of the letter included Consumer Action, the Consumer Federation of America, the Government Accountability Project and Privacy Rights Clearinghouse. 

4. Publishers must ponder paywall behavior with new Chrome browser “incognito” change

Google’s decision to make “incognito” private-browsing mode undetectable in next week’s Chrome 76 browser update will force new decisions on publishers that use metered paywalls to control access to content.

Google explained the change in a blog post entitled “Protecting Private Browsing in Chrome.”

“This will affect some publishers who have used the loophole to deter metered paywall circumvention, so we’d like to explain the background and context of the change,” Google’s Barb Palser, partner development manager, news and web partnerships, wrote in the post.

Until now, websites could tell when a user was browsing “incognito” because Chrome disabled its FileSystem API in that mode, and a site’s own scripts could check for the missing API. Many sites used that signal to challenge the visitor to register or subscribe, or else be denied further page views.
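A simplified TypeScript sketch of that check, as it was widely implemented before Chrome 76, appears below. It is illustrative only: webkitRequestFileSystem is a non-standard, Chrome-specific API, and Chrome 76 removes the behavior the check relies on.

    // Pre-Chrome-76 check, run by the page's own script: the FileSystem API was
    // disabled in incognito, so a failed request implied private browsing.
    function detectIncognito(onResult: (isIncognito: boolean) => void): void {
      const w = window as any; // webkitRequestFileSystem is non-standard, Chrome-only
      if (typeof w.webkitRequestFileSystem !== "function") {
        onResult(false); // not Chrome, or the API is missing for another reason
        return;
      }
      w.webkitRequestFileSystem(
        w.TEMPORARY,
        100,                   // ask for a trivial 100-byte quota
        () => onResult(false), // request succeeded: normal browsing
        () => onResult(true)   // request failed: likely incognito (before Chrome 76)
      );
    }

    // A metered-paywall script might use the result to decide whether to show a
    // register-or-subscribe wall instead of counting the page view.
    detectIncognito((isIncognito) => {
      if (isIncognito) {
        console.log("Prompt the visitor to register or subscribe");
      }
    });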

But now, an “incognito” visitor will look like any other arriving user and will receive access to the same number of metered pages as other unregistered guest users, analysts suggest.

“We want you to be able to access the web privately, with the assurance that your choice to do so is private as well,” Google’s Palser wrote in the blog post. 

Already, however, a researcher said he had figured out another way that websites can detect when a user is browsing in incognito mode.

5. RESOURCE: Side-by-side comparison of personal data/information in CCPA vs. GDPR

A Washington, D.C., privacy attorney has authored a side-by-side comparison of how the terms “personal information” and “personal data” vary depending on the law consulted.

David Zetoony compares the “personal information” definition in Section 1798.140(o)(1) of the California Consumer Privacy Act with the “personal data” definition in the General Data Protection Regulation’s Article 4(1).

CCPA — “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. Personal information includes, but is not limited to, the following if it identifies, relates to, describes, is capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.

GDPR — “Personal data” means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person.

Zetoony argues that under the California law, hashed data that would be hard to re-identify may not be considered “personal information,” but it still might be considered “personal data” in EU jurisdictions.

6. With “sale” broadly defined, CCPA amendment could prohibit loyalty-program data sharing, attorneys say

Swapping consumer data among loyalty, rewards, premium feature, discount or club programs may require user consent under a proposed amendment to the California Consumer Privacy Act (CCPA), attorneys say.

Writing in their firm’s blog, “Global Privacy Watch,” John Tomaszewski and Jason Brieve of Seyfarth Shaw LLP cited an amendment to Assembly Bill 846, adopted in Senate committee on July 11 and now headed for a possible floor vote in August. The amendment says: “A business shall not sell the personal information of consumers collected as part of a loyalty, rewards, premium features, discounts, or club-card program.”

They argue that the CCPA’s definition of “sale” is so expansive as to include data sharing among programs. The CCPA defines selling as: “[S]elling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.”

Even if a business is not receiving monetary payment in exchange for consumer data, it could still be considered to be selling under the CCPA in some circumstances, say attorneys for another firm, Womble Bond Dickinson (US) LLP, in a July 18 blog post. 

“If the vendor is authorized to ‘mine’ the data for its own purposes, that is likely sufficient consideration to make the disclosure to the vendor a sale under the CCPA,” wrote Womble Bond Dickinson attorneys Nicole Su, Ernesto Mendieta and Nadia G. Aram.

Meanwhile, the San Francisco Chronicle’s tech reporter Dustin Gardiner has checked on things in Sacramento and writes that the “Fight to change California’s landmark consumer privacy law fizzles — for now.” He reports that privacy watchdogs who backed the law’s adoption last year say they have held the line against attempts to weaken it.

7. Federated Learning — Google’s new, privacy-safe method of collecting and aggregating data

Instead of sending user data off devices to be analyzed centrally by machine learning, Google’s new privacy strategy does the machine learning directly on eligible devices and sends only encrypted training results back. Google calls this approach “Federated Learning.”

Previously published research on Federated Learning indicates the company has been working on the system since at least 2016, but it recently launched a consumer-facing site that uses a comic-strip approach to explain the somewhat complicated methodology in easier-to-understand terms. The comic asserts that “…we will be able to learn about everyone…without learning about anyone!”

Some are calling the promise of Federated Learning “a new dawn for AI.” Rather than centralizing training data on one machine or in a data center, training is decentralized to individual mobile devices, thanks to smartphone updates in mid-to-late 2018 that added AI chips and more computing power. One of the developers of the technology is S20.ai, a startup that creates “tools for hyper-personalized and privacy-preserving AI Experiences.”
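As a rough illustration of the idea, and emphatically not Google’s implementation (which adds secure aggregation and encryption of the updates), the TypeScript sketch below runs a toy version of federated averaging: each simulated device trains a tiny model on its own data, and only the resulting weights are averaged by the “server.” The device data, model and learning-rate values are invented for the example.

    // Toy federated-averaging sketch. Each "device" trains on its own data, and
    // only the resulting model weight, never the raw data, leaves the device.
    type Example = { x: number; y: number };

    // Local training: a few gradient steps on the model y = w * x,
    // using only this device's examples.
    function trainLocally(start: number, data: Example[], lr = 0.01, steps = 20): number {
      let w = start;
      for (let s = 0; s < steps; s++) {
        let grad = 0;
        for (const { x, y } of data) {
          grad += 2 * (w * x - y) * x; // derivative of squared error with respect to w
        }
        w -= (lr * grad) / data.length;
      }
      return w;
    }

    // The "server" never sees device data, only each device's trained weight,
    // which it averages into the next global model.
    function federatedRound(globalWeight: number, devices: Example[][]): number {
      const localWeights = devices.map((data) => trainLocally(globalWeight, data));
      return localWeights.reduce((a, b) => a + b, 0) / localWeights.length;
    }

    // Three simulated devices whose private data all roughly follow y = 3x.
    const devices: Example[][] = [
      [{ x: 1, y: 3.1 }, { x: 2, y: 5.9 }],
      [{ x: 1, y: 2.8 }, { x: 3, y: 9.2 }],
      [{ x: 2, y: 6.1 }, { x: 4, y: 11.8 }],
    ];

    let globalModel = 0;
    for (let round = 0; round < 10; round++) {
      globalModel = federatedRound(globalModel, devices);
    }
    console.log(globalModel.toFixed(2)); // close to 3, with no raw example leaving a device

The point of the exercise is that the shared model converges toward the common pattern even though no individual’s raw data is ever centralized.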

LEARN MORE: https://federated.withgoogle.com/

8. Business benefits of good data protection processes

An article this month in Computer Weekly encouraged organizations to think bigger than avoiding GDPR fines and to focus on the “benefits of doing the right thing.” The article asserts that following good practices under GDPR will increase consumer confidence and lead to more insightful marketing and better data management. One example given is that “having a formal deletion process has serious benefits for an organisation, allowing it to manage and understand the data it holds, remove low-value data and, in turn, reduce data storage costs.”

“The improvement in data quality will unlock opportunities for more discerning marketing that deploys contextual campaigns to groups or individuals (subject to consent, of course) that are far more likely to be successful in revenue terms.”

WASHINGTON UPDATE:

  • Attempts to enact a federal data-privacy law are “all very likely to add up to a big zero in this Congressional session,” Fast Company’s Mark Sullivan writes, based on his conversations with political insiders in both parties.
  • Members of the U.S. Congress have introduced at least seven privacy bills this year, but none have been put to a vote yet. DigiDay reporter Tim Peterson summarizes each of them.

QUOTE OF THE WEEK

So what’s wrong with free? It is always a lie, because on this earth nothing, in the end, is free. You are exchanging incommensurable items… Instead of paying — and signaling — with the fungible precision of money, you pay in the slippery coin of information and distraction… Of all Google’s foundational principles, the zero price is apparently its most benign. Yet it will prove to be not only its most pernicious principle but the fatal flaw that dooms Google itself… Google’s insidious system of the world will be swept away.

Tech prognosticator George Gilder, in his new book “Life After Google: The Fall of Big Data and the Rise of the Blockchain Economy,” published July 17.

TIDBITS

Equifax Settlement Teaches The Dos And Don’ts About Data Security | Nicole Kardel, Ifrah PLLC

Amazon Emerges As Google Challenger In Advertiser Perceptions SSP Report | Sarah Sluis (AdExchanger)

Five data privacy startups cashing in on GDPR | Paul Sawers (VentureBeat) 

Five details about AdBlock Plus you should know | Stephen Shankland (C|Net)

Nearly a third of EU firms still aren’t compliant with GDPR, says accounting company RSM | Sead Fadilpasic (ITProPortal) 

Trade Desk clashes with Google over transparency initiative | Sarah Sluis (AdExchanger) 

Rein in data brokers, Intel lawyer David Hoffman argues in NYTimes op-ed

CCPA: The start of a new era of consumer-privacy laws? | Kelsey Finch (Future of Privacy Forum)

Like what you see? Then recommend to a friend.

Follow ITEGA’s Facebook page for additional links and insights: https://www.facebook.com/itega.org

Subscribe to Privacy Beat

