PRIVACY BEAT:  SCOTUS raises bar for privacy lawsuits; Google kicks the can; could FTC regulate ad-tech without new laws?

Privacy Beat

Your weekly privacy news update.



A case involving TransUnion (headquarters, above) has privacy implications

Supreme Court, siding with Facebook and Google, narrows options for privacy (and other) class-action plaintiffs; could this step up the drive to document where data is flowing?

In a closely watched case, the U.S. Supreme Court on Friday sided 5-4 with Google, Facebook and credit-reporting company TransUnion, tightening the rules under which citizens can bring class-action lawsuits in privacy and other cases. However, one view is that the decision could increase the drive to document privacy abuses in the digital realm.

MediaPost reporter Wendy Davis wrote about the case, TransUnion v. Ramirez, noting that in February the two companies, plus eBay and three major tech-company trade associations, had filed a “friend of the court” brief arguing that they are “frequently subjected to opportunistic lawsuits” based on alleged violations of privacy laws.

The case involved a $60-million judgment against TransUnion for allegedly failing to ensure the accuracy of credit reports. Sergio Ramirez argued that TransUnion should compensate not only him but also 8,000 other individual class members for similar inaccuracies. TransUnion countered that the harm Ramirez suffered was different from that of the other class members and shouldn’t be at issue.

The 9th U.S. Circuit Court of Appeals disagreed with TransUnion, Davis wrote in February, ruling that all of the class members “suffered a material risk of harm to their concrete interests” due to TransUnion’s “failure to follow reasonable procedures to assure maximum possible accuracy” of the information it reported. The Supreme Court’s decision reversed the appeals court, setting a new, higher bar for the concrete harm plaintiffs must show to be included in a class-action claim.

Even the 9th Circuit decision wasn’t particularly favorable to plaintiff Ramirez, according to a Harvard Law Review summary of the case and decision.  

Requiring a higher standard for “harm” could give plaintiffs’ lawyers the incentive and the right to compel discovery of how big social-media companies supply targeted data to scammers, tweeted Don Marti, VP of ecosystem innovation at ad-tech company CafeMedia. “That’s probably the easiest concrete harm to show,” Marti wrote. The result, he added, could be “more background info for the ban surveillance marketing advocates.”
 

Google pushes end of third-party cookie tracking out two years

Google’s decision to delay for at least two more years the blocking of third-party cookies by its Chrome browser takes a potential antitrust talking point off the table for competing ad-tech companies and publishers. But it is also a relief to them, because nobody is yet pleased with the alternatives. There was a ton of coverage of the move, announced in a blog post on Thursday, and a few good links are below.

ANTITRUST 

Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More


Could the FTC under Lina Khan adopt a rule making unauthorized ad-tech sharing of user data illegal? 

An intriguing thought emerged this week from the ongoing coverage and analysis of the appointment of Lina Khan to chair the U.S. Federal Trade Commission (FTC).

Attention has been focused for a year or more on congressional efforts to move bills that would upgrade the United States’ definition of digital privacy rights to be competitive with European and California laws. But what if the FTC could simply write and adopt regulatory rules about the same thing and then fine violators?

“The FTC could pass a rule that says: If you materially mislead a consumer through third-party information sharing, it is an unfair practice, and violators of that rule would have to pay money damages,” according to legal scholar Chris Hoofnagle, quoted deep in a Protocol.com article by Issie Lapowsky headlined: “More bad news for Big Tech: Lina Khan’s a privacy hawk, too.” Earlier in the article, Hoofnagle observes that platform companies which “perennially violated users’ privacy would likely lose ground to more privacy-conscious rivals.”

The difficulty with the FTC approach, however, is that its enforcement authority rests on a relatively vague, though well-tested, law that allows fines for “unfair and deceptive” trade practices. So a violator has room to argue over what is unfair and deceptive. For example, would it be an “unfair and deceptive” practice if the real-time-bidding advertising system were used by bad actors to share and aggregate user profiles without specific end-user permission?
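To make that scenario concrete, below is a minimal, hypothetical sketch of the kind of user data a single real-time-bidding request can broadcast to the bidders in an ad auction. The field names loosely follow the general shape of the OpenRTB protocol, but the structure is simplified and every value is invented for illustration.

```python
# Illustrative only: a simplified bid request in the general shape of the
# OpenRTB protocol, showing how much user detail can fan out to every bidder
# invited to a single ad auction. All values are invented; no real exchange's
# payload is reproduced here.
import json

bid_request = {
    "id": "auction-8f2c",  # one ad slot, one auction
    "site": {
        "domain": "news.example",               # hypothetical publisher
        "page": "https://news.example/article",
    },
    "device": {
        "ip": "203.0.113.7",                     # sometimes truncated, sometimes not
        "geo": {"lat": 40.71, "lon": -74.00},    # coarse or precise location
        "ua": "Mozilla/5.0 (placeholder user agent)",
    },
    "user": {
        "id": "9d1e-example",        # the exchange's persistent user identifier
        "buyeruid": "a77b-example",  # a bidder's own cookie-synced identifier
        "yob": 1987,                 # year of birth, if known
        "gender": "F",
    },
}

# The same payload goes to every bidder in the auction, win or lose, which is
# why aggregation of profiles by bad actors is the scenario the article raises.
print(json.dumps(bid_request, indent=2))
```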

Pointing to one of its own rules would give the FTC something specific. But what might help even more would be if the FTC could point to a set of privacy-forward rules established by a non-profit governance and certification entity (like ITEGA, the publisher of this newsletter, Privacy Beat).

The idea of a “safe-harbor”-creating entity has already been broached by two Brookings Institution scholars, Cam Kerry and John Morris, in a white paper last year. It suggested that federal law give preference to entities that follow privately enforced rules around privacy and identity. They wrote: “One section would also authorize the FTC to approve private compliance programs which include ‘meaningful action’ for noncompliance.” Such actions, they wrote, “may include” removal of a covered entity from the program, referral for enforcement, public reporting of disciplinary action, redress for individuals harmed or voluntary payment of federal fines.

AD TECH

WASHINGTON WATCH 

PERSONAL PRIVACY 

STATEHOUSE BEAT 


A total of 55 global privacy groups, including many from the United States, released a letter this week calling on European Union and U.S. authorities to ban “surveillance advertising.” The letter, released by the Norwegian Consumer Council with an accompanying report, doesn’t define anywhere what it means by “surveillance advertising,” however. It does refer to “the rise of a surveillance economy where everything consumers do is being tracked both online and offline, aggregated and shared.”

“ . . . As shown in the attached report, the surveillance-based advertising model facilitates systemic manipulation and discrimination, poses serious national security risks, funds disinformation and fraud, while also undermining competition and taking revenue away from content creators,” the letter says. “This harms consumers and businesses, and can undermine the cornerstones of democracy.”

The signatories include consumer- and civil-rights groups, NGOs and individual academics, including — in the United States — the Electronic Privacy Information Center, Ranking Digital Rights, Fight for the Future, the Center for Digital Democracy, Public Citizen, U.S. PIRG, Consumer Action and the Consumer Federation of America.

For an excerpt of the “call to action against surveillance-based advertising,” see QUOTE OF THE WEEK, below.

EU PRIVACY — Google investigation

 

NYTimes’ tech exec, on personal blog, calls for three changes in Google-CMA understanding; cites Google 2007 commitments 

A top data-science executive at The New York Times, writing on an explicitly personal — yet public — blog, suggests that Google needs to better live up to a 14-year-old pledge about data use — and he offers three suggestions for amending Google’s understanding with British regulators over the end of the third-party cookie.

Robin Berjon’s blog post quotes David Drummond, then Google’s SVP of corporate development and chief legal officer, in Sept. 27, 2007 testimony to a U.S. Senate antitrust subcommittee about the Google / DoubleClick merger:  “Again, no control over the advertising, no ownership of the data that comes with that that is collected in the process of advertising. That data is owned by the customers, publishers and advertisers, and DoubleClick or Google cannot do anything with it.”

Berjon goes on to assert that by 2018, Google wanted to own “audience data” in its European advertising services, yet expected to hold publishers legally responsible should it be misused.

Now, amid negotiations with the U.K.’s Competition and Markets Authority (CMA) over cookie deprecation, Berjon suggests that Google commit to:

  • “Not using publisher data for any purpose other than those explicitly requested by the publisher as part of a service agreement”
  • “Not using Chrome Sync data for any purpose other than the sync service, improving Chrome, or security”
  • “Addressing issues pertaining to AMP by reaching consensus with interested publishers in a W3C group prior to removing third-party cookies.”  (AMP stands for “accelerated mobile pages,” a technology Google introduced in 2015 that has had the effect of storing news pages on Google servers.)

IDENTITY AND PRIVACY

Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

Google consultant seeks input for new W3C discussion forum around cross-site “federated identity” sharing; charter language proposed

The World Wide Web Consortium (W3C) may be about to start hosting a new (and open to the public) discussion group seeking to develop standards for so-called “federated identity” — a method that lets a user hold an account or identity at one website and use that identity to be known to, or to access resources at, other sites.

“During the Federated Identity and Browser Workshop on May 25 & 26, we collectively decided to spin up a new community group focused on federated authentication,” says Heather Flanagan, a consultant to Google Inc. “Several of us took the action to draft a charter for such a group, and we’d like your feedback.”

The move is important because cross-site identity management is threatened by browser blocking of third-party cookies in Safari, Firefox, Edge, Brave — and eventually Google’s Chrome browser. Publishers and many businesses are worried that federated single sign-on (SSO) and related cross-site services may be affected. The creation of the new W3C group would provide a forum for resolving those concerns.
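For readers less familiar with the mechanics, here is a minimal sketch of the redirect-based handshake that federated login generally relies on, written with hypothetical URLs and parameter names; it is not the W3C draft or any specific vendor’s API, just an illustration of the pattern at stake.

```python
# A minimal sketch of the redirect-based federated login handshake, written
# with hypothetical URLs and parameter names (this is not the W3C draft, nor
# any specific vendor's API). A relying-party site sends the user to an
# identity provider (IdP), then verifies the response that comes back.
import secrets
from urllib.parse import urlencode, parse_qs, urlparse

IDP_AUTH_URL = "https://idp.example/authorize"   # hypothetical identity provider
CLIENT_ID = "news-site-example"                  # hypothetical relying party
REDIRECT_URI = "https://news.example/callback"

def build_login_redirect() -> tuple[str, str]:
    """Return (redirect_url, state); the random state guards against forgery."""
    state = secrets.token_urlsafe(16)
    query = urlencode({
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "response_type": "code",   # authorization-code style flow
        "scope": "openid",
        "state": state,
    })
    return f"{IDP_AUTH_URL}?{query}", state

def handle_callback(callback_url: str, expected_state: str) -> str:
    """Extract the one-time code the IdP returned, checking the state first."""
    params = parse_qs(urlparse(callback_url).query)
    if params.get("state", [None])[0] != expected_state:
        raise ValueError("state mismatch: possible forged callback")
    return params["code"][0]   # later exchanged server-to-server for an ID token

if __name__ == "__main__":
    url, state = build_login_redirect()
    print("Send the user to:", url)
    # Simulate the IdP redirecting the user back to the relying party:
    code = handle_callback(f"{REDIRECT_URI}?code=abc123&state={state}", state)
    print("Authorization code received:", code)
```

Roughly speaking, the visible redirect shown above survives third-party cookie blocking; what tends to break is the silent, hidden-iframe session check that many SSO deployments layer on top, because it depends on the identity provider’s cookie being readable in a third-party context. That gap is part of what the proposed community group would examine.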

The seven-page draft charter says the purpose of the Federated Identity Community Group is both to support federated identity “and prevent untransparent, uncontrollable tracking of users across the web.” It adds: “While the community group will take privacy concerns into consideration, these concerns will be balanced against the need to explore innovative ideas around federated authentication on the web.”

PRIVACY BUSINESS 

PANDEMIC PRIVACY 

WORLD PRIVACY

 

MEDIA, NEWS AND DATA

UPCOMING EVENTS 

QUOTE OF THE WEEK

Is surveillance ad-tech manipulation, discrimination, a security risk, disinformation, fraud?

  • Below is an excerpt from a letter signed by 55 NGOs and privacy advocates globally, including many in the United States, and released by the Norwegian Consumer Council on June 23. 

“In the EU, we urge you to consider a ban on surveillance-based advertising as a part of the Digital Services Act. In the US, we urge legislators to enact comprehensive privacy legislation.

“ . . . The surveillance economy is sometimes erroneously presented as a trade-off, where consumers allow companies to track them in order to receive access to digital content. As the attached report by the Norwegian Consumer Council shows, a majority of consumers do not wish to be tracked online. However, the ubiquity of commercial surveillance means that it is practically impossible to avoid being tracked, profiled and targeted.

“ . . . As shown in the attached report, the surveillance-based advertising model facilitates systemic manipulation and discrimination, poses serious national security risks, funds disinformation and fraud, while also undermining competition and taking revenue away from content creators. This harms consumers and businesses, and can undermine the cornerstones of democracy.

“ . . . Although we recognize that advertising is an important source of revenue for content creators and publishers online, this does not justify the massive commercial surveillance systems set up in attempts to “show the right ad to the right people”. Other forms of advertising technologies exist, which do not depend on spying on consumers, and cases have shown that such alternative models can be implemented without significantly affecting revenue. There is no fair trade-off in the current surveillance-based advertising system.”

ABOUT PRIVACY BEAT

Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker.  Submit links and ideas for coverage to newsletter@itega.org



Copyright © 2021 Information Trust Exchange Governing Association, All rights reserved.

