Privacy Beat

Your weekly privacy news update.


Methods for sharing user data (W3C Federated ID group slide)

W3C debating: Should log-in to multiple websites be considered “sanctioned” or “unsanctioned”? Should browsers block?

A public World Wide Web Consortium (W3C) group, with participants largely from Google and Microsoft, continues its work of proposing ways in which users can “log in” to multiple websites after third-party cookies are disabled, sometime during 2023. The Federated Identity Community Group’s meeting notes and discussion documents are generally publicly accessible.

An open question in the group continues to be whether such a federated login, or Single Sign-On (SSO), should be considered “sanctioned” or “unsanctioned” tracking that should be blocked by web-browser software or other means, notes of the Dec. 3 meeting and a related “user story” discussion document show.

A slide deck (see above) prepared for the Federated ID Community Group by Microsoft’s Tim Cappalli and Auth0/Okta’s Vittorio Bertocci usefully defines the subjects the group is considering: third-party cookies, link decoration, bounce tracking, identity-specific APIs and the browser assuming an active role in identity flows.

The question of what constitutes “tracking” across the web is not settled. A second, larger W3C group, the Privacy Community Group, is working on it, notes from an October meeting show. The Privacy CG’s next virtual meeting is at noon U.S. EST on Thursday, Dec. 9 (login details). It is open to the public and currently has over 400 registrants. Its last meeting, on Oct. 14, included a rich discussion of tracking without user authorization.




Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More


Newmark-funded Aspen Tech Policy Hub offers over $75,000 for projects to fix “information disorder”

A unit of the Aspen Institute announced it was offering over $75,000 for “unique and innovative projects that make meaningful progress toward ending information disorder.” It’s the next step in the work of a Commission on Information Disorder, established by the Aspen Tech Policy Hub and funded by Craig Newmark Philanthropies. 

In a Nov. 15 report, the West Coast-based policy hub made 15 recommendations, including adopting a U.S. federal-government response; setting up an independent nonprofit organization at the intersection of technology, democracy and civil rights with an education and local focus; citizen empowerment around media literacy education; and changes to Section 230 of the Communications Decency Act.



WSJ details marketers collecting user data as other sources dry up; but what can the data be used for?

A richly researched Wall Street Journal article surveying big advertisers finds that they are rushing to develop so-called “first-party” relationships with their customers and shopping browsers out of fear that they will lose access to third-party tracking data from the ad-tech industry. The story illustrates how the market for user data collection is beginning to shift.

“Consumer packaged goods companies, in particular, will likely struggle to get meaningful quantities of data, since many don’t sell directly to their customers,” writes WSJ reporter Suzanne Vranica in the nearly-full-page roundup. She says marketers are using loyalty programs, sweepstakes, newsletters, quizzes, polls and QR codes to collect user data — primarily email addresses. 

“Such a record might include dozens, even hundreds, of data points, including the store locations people visit, the items they typically buy, how much they spend and what they do on the company’s website,” Vranica wrote in the piece published Dec. 3. A key issue for marketers will be steering clear of European and California data-privacy laws, which may complicate how they use, share or sell such collected data.





Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat


Denham says the key thing people care about: Trust in the rules and regulations governing technology

  • The following are excerpts of a speech by outgoing British information commissioner Elizabeth Denham, delivered to the Chartered Institute for IT. She took office in 2016.

” . . . Clearly much has changed since 2016. The past couple of years have seen an acceleration in the take-up of digital services, alongside a growing awareness of privacy rights and their value. And that’s made my world, and your worlds, look a whole lot different.

” . . . A key message of that first speech was that organizations should not be thinking about privacy or innovation, but about privacy and innovation. That remains as true today as ever. It was central through the pandemic, as the value of data protection as an enabler shone through, encouraging people to trust innovation by showing that their views are being respected.

” . . . [T]he value of data protection as enabling innovation is greater than ever, whether that’s in enabling contact tracing apps or in protecting firms from cyber attack. Privacy, cyber security, considering the impact of digital innovation – these are all board level concerns.

“What is crucial, though, is that amid the pace of change, we don’t forget this relationship between innovation and privacy . . . . Transparency is key here. And by that I mean real transparency: sensible explanations of how data is being used, the benefits that will result, and taking the time to check people understand. Regulation plays an important role here too.

“[A] study earlier this year showed the single biggest predictor of whether someone believed in the role of digital innovation in response to the pandemic was not their level of concern about the pandemic, nor their age or education. It was trust in the rules and regulation governing the technology . . .

” . . . With that in mind, I am deeply concerned about any changes to the data protection regime that would remove the centrality of fairness in how people’s data is used. I am thinking specifically of AI and algorithms here, and questions in the consultation about the TIGRR proposal to remove the right to human review of automated decisions. This feels like a step backwards.”

“AI and algorithms rely on the data that is fed into them, in order to produce the world-changing outputs that come out the other end. Put simply, if people start to mistrust those outputs, then they’ll start to block their data being used as an input. Building that trust starts with transparency, and continues in a commitment to fairness wherever people’s data is used. Without that trust, we risk losing so many opportunities that technologies can offer our society.”


Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker. Submit links and ideas for coverage to newsletter@itega.org





Copyright © 2021 Information Trust Exchange Governing Association, All rights reserved.
