Facebook takes hits; questions anti-tracking initiative; LWV pans Prop24


Privacy Beat

Your weekly privacy news update.



Facebook takes multiple PR hits: Over EU data use, and from docu-drama; it will continue to collect IDFA unique user IDs

From a PR standpoint, it’s not been a good week for Facebook’s privacy reputation. Yet the company, after winning a concession from Apple on a privacy-tightening initiative, disclosed it will continue the user-tracking practice Apple had sought to cut off.

In a Sept. 10 note on its developer blog, Facebook said it would continue to collect IDFA unique identifiers from iPhone (iOS) users, but would not use them in Facebook apps, implying it would still pass them along to ad partners. It said it would engage with the W3C and PRAM on such privacy, identity and ad-tech issues.

  • The Wall Street Journal broke the news, followed by other news organizations (links below), that Irish data-protection authorities had told Facebook it can no longer legally export data about its EU-based users to the United States, a position that, if upheld, could force Facebook to divide its operations along data lines.
  • As if that wasn’t enough, the lawyer for European data-privacy activist Maximilian Schrems sent a letter to the Irish Data Protection Commission giving it until Sept. 10 to confirm it was pursuing Schrems’ seven-year-old complaint against Facebook Ireland Limited for alleged illegal use of user data.
  • As for film, Wednesday’s release by Netflix of the 90-minute docu-drama “The Social Dilemma” brings together a host of ex-social-media tech leaders and investors to accuse Facebook of fostering privacy, health and public-policy problems.

Tristan Harris, the ex-Google exec who now heads the Center for Humane Technology (CHT) and is featured in the documentary, tweeted: “Community ownership and governance of technology platforms is a big part of the movement towards a more human future.” CHT also started a new podcast: “Your Undivided Attention.”

Even Marc Benioff, CEO of Salesforce.com, one of Silicon Valley’s most successful data-management platforms, tweeted a two-minute trailer for the documentary, asserting: “Everyone must watch this.”

Piling on was this tweet from Jason Kint, CEO of Digital Content Next, the trade group of quality online publishers: “Its EU data regulator (Ireland) warns to stop transferring data to the US; Germany already determined it has to stop collecting data across its apps; it’s likely FB is violating California data law (and) FB lobbied Apple to push back IDFA changes for months.”



Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More


LWV comes out against Prop24, citing price discrimination and pay-for-privacy; supporters cite other benefits

Trying to sort out the white hats from the black hats when it comes to California’s November ballot privacy initiative is taking more and more time.

This week the League of Women Voters of California came out with a detailed analysis of Prop24, the California Privacy Rights Act, and urged its rejection. So did the California Nurses Association and a couple of newspaper editorials. This comes a week after ex-presidential contender Andrew Yang stepped up to be a pro-Prop24 spokesperson.

The difficulty lies in part in the relationship between Prop24 and the state’s almost-new California Consumer Privacy Act (CCPA), which has been enforced since July 1. There are provisions in Prop24 that would augment, or change, the CCPA.

Some opponents of Prop24 argue that the originator of both the law and the ballot referendum, real estate developer Alastair Mactaggart, was too influenced by Facebook, Google and others in Silicon Valley in the drafting of Prop24. Another point opponents are pushing: the claim that Prop24 is written so that tougher privacy law will be foreclosed once it is enacted. Mactaggart and supporters say the opposite is true.

“Proposition 24 would allow customers to tell businesses not to share data about them,” wrote supporter Brian Schrader, in a San Diego business journal. “It would also allow customers to opt-out of having their sensitive personal information sold or used for advertising, and provide a host of additional protections for minors.”

The LWV position statement focused on the assertion that Prop24 would make it legal for websites to charge users extra for the same services if they aren’t willing to share personal data, or to give people rewards for data through loyalty programs. The platforms haven’t said anything publicly about Prop24. Prop24’s opponents include the ACLU of California, Consumer Action and Color of Change. Consumers Union and the Electronic Frontier Foundation are staying neutral.





Foundation-backed researcher suggests Section 230 changes could leverage platforms away from harmful content — and toward supporting journalism

Three major foundations supported work on new ideas for how digital platforms might help sustain journalism and take on responsibility for curbing misinformation, by legislatively tweaking Section 230 of the Communications Decency Act of 1996.

The latest output is a white paper out this week from a journalist-lawyer-turned-academic, Paul M. Barrett, deputy director of New York University’s Stern Center for Business and Human Rights.

“The benefits of Section 230 should be used as leverage to pressure platforms to accept a range of new responsibilities related to content moderation,” Barrett writes in “Regulating Social Media: The Fight Over Section 230 — and Beyond.” He adds: “Platforms may reject those responsibilities, but in doing so they would forfeit Section 230’s liability protection — and probably a good deal of user and advertiser loyalty.” (See QUOTE OF THE WEEK, below, for additional excerpts.)


A former journalist who now teaches at New York University’s business and law schools, Barrett proposes amending “Section 230,” the 24-year-old federal immunity from content liability that Internet websites and platforms enjoy, to create levers that bear down on fake news and other harmful web content, and potentially help fund journalism.

Barrett, whose early career spanned two decades with the Wall Street Journal and Bloomberg News, also says a federal agency should be created to foster Internet trust. His idea: encourage platforms such as Facebook, YouTube and Twitter to moderate user-generated content and posts as a condition of continued legal immunity under Section 230. He also cites a 2019 University of Chicago paper that would have the platforms contribute to a journalism-funding “voucher system” under which a federal taxpayer could allocate up to $50 a year to media outlets of their choosing.

Republicans who perceived bias against their party by Facebook and Google began suggesting changes to Section 230 more than a year ago. But then former Vice President Joe Biden also suggested changes. As a result, there are now a variety of bipartisan suggestions for tweaking or abandoning Section 230 of the Communications Decency Act of 1996.

Barrett’s white paper, released on Tuesday, Sept. 8 by NYU, stakes out a middle ground between abandoning the immunity law and adjusting it. It was produced with funding support from Craig Newmark Philanthropies, the John S. and James L. Knight Foundation and the Open Society Foundations. In his acknowledgements Barrett cites the “time and insights” of a dozen key experts.

“Section 230’s original purposes — incentivizing content moderation and protecting digital sites from getting sued into oblivion — remain valid,” Barrett writes in his recommendations. “The law should survive. But that doesn’t mean it ought to be preserved in amber.”




Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

Facebook engineer questions need for ‘IsLoggedIn’ standard; supporters say it would curb tracking

Policy debate among publishers, ad-technology companies and web-browser software makers continued this week during voluntary-standards discussions hosted by the World Wide Web Consortium (W3C), highlighted by news accounts and op-ed discussion.

In a virtual meeting of the W3C’s Privacy Community Group, frequented and largely led by developers at web-browser companies, a Facebook engineer pushed back on the idea that browsers should make decisions about storing a user’s “logged-in state.” The “IsLoggedIn” proposal under discussion would have browser software signal to the user where they are logged in. Millions of web users may not be aware they are “logged in” most of the time at Facebook or Google.
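The mechanics at issue can be sketched in miniature. Below is a toy Python model of a browser keeping a per-site logged-in registry, in the spirit of the IsLoggedIn idea: a site declares the user logged in with an expiry the browser enforces, and the browser can then answer (and surface to the user) where a login is active. The class and method names are hypothetical illustrations, not the actual proposal’s API.

```python
import time
from dataclasses import dataclass, field


@dataclass
class LoginStateStore:
    """Toy model of a browser-managed logged-in registry (names hypothetical)."""
    _state: dict = field(default_factory=dict)  # site -> expiry timestamp

    def set_logged_in(self, site: str, ttl_seconds: float) -> None:
        # A site declares the user logged in; the browser records an expiry.
        self._state[site] = time.time() + ttl_seconds

    def set_logged_out(self, site: str) -> None:
        # A site (or the user) clears the logged-in state.
        self._state.pop(site, None)

    def is_logged_in(self, site: str) -> bool:
        # The browser, not the site, answers whether a login is active.
        expiry = self._state.get(site)
        if expiry is None or expiry < time.time():
            self._state.pop(site, None)  # lazily expire stale entries
            return False
        return True


store = LoginStateStore()
store.set_logged_in("social.example", ttl_seconds=3600)
print(store.is_logged_in("social.example"))  # True
print(store.is_logged_in("news.example"))    # False
```

The point of contention in the meeting maps onto this sketch directly: the proposal puts the browser in charge of the registry, which is what Facebook’s engineer objected to as “injecting browsers … into the login flow.”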

“I’m not convinced that is a problem that needs to be solved,” said Facebook’s Ben Savage.  He said IsLoggedIn amounted to “injecting browsers and the companies that own browsers into the login flow.”

“In general, being logged in really poses a situation where you are very likely to be trackable across sites,” commented Michael Kleber, a Google Chrome software engineer. “I think we want some elevated signal from a user that they are back on a site they were on before, and the intent to preserve some sort of state makes sense to me.”

Underlying the W3C discussion is the question of how to limit what’s called “bounce tracking,” a practice by ad-tech firms and others that happens without user knowledge and is considered a privacy concern.

George Fletcher, an identity standards architect at Verizon Media, suggested that websites participating in federated login services should post to the web an authoritative list of all the domain names involved, so that web-browser software could “distinguish login flows from bounce tracking.”
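Fletcher’s suggestion implies a simple classification rule: intermediate redirect hops through domains on a published federated-login list are treated as login flows, while other intermediaries look like bounce tracking. A minimal Python sketch, with hypothetical domain names and an illustrative list:

```python
def classify_redirect_hops(chain, federated_login_domains):
    """Label each intermediate hop in a redirect chain as a login flow
    (if the domain appears on a published federated-login list) or as
    possible bounce tracking. Names and data are illustrative."""
    hops = chain[1:-1]  # exclude the origin and the final destination
    return {
        hop: "login flow" if hop in federated_login_domains
        else "possible bounce tracking"
        for hop in hops
    }


# A user leaves news.example, bounces through two intermediaries,
# and lands back on news.example.
chain = ["news.example", "idp.example", "trackhub.example", "news.example"]
published_list = {"idp.example"}  # hypothetical authoritative list
print(classify_redirect_hops(chain, published_list))
# {'idp.example': 'login flow', 'trackhub.example': 'possible bounce tracking'}
```

In this sketch the browser lets the declared identity provider pass, while the undeclared intermediary is the kind of hop a browser might throttle or strip of storage.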

One of the most authoritative trade journalists covering advertising technology summed up the browser-website struggle over user identity and privacy with this lead: “Expectations continue not to reflect reality in the business and working groups of the World Wide Web Consortium (W3C).”

Efforts by major ad-tech software companies to slow down the browser-data-control train advanced after a 20-company group informally petitioned the top leadership of the W3C for more input in deliberations. That appears to have resulted in an invitation to initiate a new “Decentralized Web Interest Group.”

The interplay includes a whimsical aspect: the three proposals for how to handle user data are each referred to by avian acronyms: TURTLEDOVE, SPARROW and FLoC. Two of them, TURTLEDOVE and FLoC, took wing from browser-company developers and are designed to store and manage private user data within the browser.

The third, SPARROW, comes from a technologist at Criteo, a big ad-tech company, and proposes that user data be stored “in the cloud” on website servers. (ITEGA, sponsor of this Privacy Beat email newsletter, has floated an idea called “UDEX.ORG,” which would lodge anonymized user preferences within a data service governed by ITEGA, a 501(c)(3) California-chartered public-benefit nonprofit corporation.)




Revising Section 230: Proper governance of the Internet should be embraced by society at large, Barrett writes

“Internet companies ‘enjoy a hidden subsidy worth billions of dollars’ by being exempted from liability for most of the speech on their platforms. Roughly speaking, the subsidy is comparable to spectrum licenses provided to broadcasters, rights of way to cable companies, and orbital slots to satellite operators . . . . 

“. . . [T]he benefit Section 230 confers ought to come with a price tag: the assumption of greater responsibility for curbing harmful content. The measure should be amended so that its liability shield provides leverage to persuade platforms to accept a range of new responsibilities related to policing content. Internet companies may reject these responsibilities, but in doing so they would forfeit Section 230’s protection, open themselves to costly litigation, and risk widespread opprobrium. There’s a crisis of trust in the major platforms’ ability and willingness to superintend their sites. Creation of a new independent digital oversight authority should be part of the response . . . . 

“. . . Proper governance of the internet is a cause that ought to be embraced not only by users, but by the society at large, which is now so profoundly shaped by online technology.”


Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker.  Submit links and ideas for coverage to newsletter@itega.org.

Share

Tweet

Forward




Copyright © 2020 Information Trust Exchange Governing Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp