Two key scholars, Zuboff and Hartzog, say individual control cannot fix Big Tech privacy abuses without governance

Privacy Beat

Your weekly privacy news update.


Two key scholars, Zuboff and Hartzog, in updated analyses, say individual control cannot fix Big Tech privacy abuses without governance

Two key scholars are out with updated analyses of surveillance capitalism and the significance of “privacy.” Taken together, they see individual privacy choice as no match for Big Tech data abuses absent harms-focused regulation and governance.

First, there’s an essay by Shoshana Zuboff, whose 2019 book The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power is a foundation of concerns about Big Tech and Big Data. The essay takes up a whole page of the New York Times’ Sunday opinion section on Nov. 14 and is worth reading in its entirety.

“The Internet as a self-regulating market has been revealed as a failed experiment,” she declares, adding later: “Mr. [Mark] Zuckerberg ‘just went for it’ because there were no laws to stop him from joining Google in the wholesale destruction of privacy.” Lawmakers, she writes, “have allowed private capital to rule our information spaces during two decades of spectacular growth, with no laws to stop it.” The answer, she says, lies in “regulating extraction” of data to upend Big Tech’s business model, because “the corporation that is Facebook may change its name or its leaders, but it will not voluntarily change its economics.”

Meanwhile, Woodrow Hartzog, the noted privacy-law expert and a professor of law and computer science at Northeastern University, is the author of a groundbreaking essay in the November issue of the University of Chicago Law Review. It’s entitled: “What Is Privacy? That’s the Wrong Question.” Much of his essay extensively analyzes — and applauds — the work of another privacy legal scholar, Daniel J. Solove of George Washington University’s law school.

(See QUOTE OF THE WEEK, below, for an excerpt from Hartzog’s essay)

Drawing on Solove’s reasoning and some of his words, and applying his own substantial scholarship and authorship on the topic, Hartzog says it is not necessary to come up with a one-size-fits-all definition of “privacy” in the digital world. Rather, he says, what’s needed is analysis by lawmakers of the many ways information technology — by rather opaquely collecting and analyzing vast troves of consumer behavior — is affecting our personal and collective lives and governance.

“These tools power systems that make decisions about people’s personal lives,” writes Hartzog. He adds later: “Instead of squabbling over the binary boundaries of privacy, people who understand privacy as more of a vague umbrella term can leave the line-drawing question for another day and get to work identifying problems created by specific conduct, articulating the values implicated by those problems, and crafting solutions to the problems that serve those values.”

Transparency, consent and control solutions aimed at accountability “won’t be enough to get us out of this mess,” Hartzog concludes, in part because the data ecosystem is too complicated for individual consumers to comprehend how to exercise control over their data, even if they had such control.

Solove’s work helps scholars and lawmakers tackle (1) how capitalistic incentives cause companies to leverage information in harmful ways, (2) how the design of information technologies matters as much as data practices, and (3) how marginalized populations are affected first and hardest by privacy abuses.




Does your organization need customized privacy compliance solutions? ITEGA can help.

We bring together the support you need to approach compliance with CCPA, GDPR if needed, and future privacy legislation as it emerges.

Learn More

W3C study group offers examples of sharing data — when should it be “sanctioned” or “unsanctioned”?

A small group of technologists is continuing to work on ideas that could lead to industry standards for the sharing of user data, with a goal to “prevent untransparent, uncontrollable tracking of users across the web.” The Federated Identity Community Group is meeting under World Wide Web Consortium (W3C) rules.

At issue is how web-browser software determines when to allow the sharing of user behavior among multiple websites — and whether browsers should even exercise such control. Key questions in meetings so far: Should corporate-affiliated websites be allowed to share data, while other more loosely affiliated sites cannot? For what purposes should data be shared? For how long?

Twelve companies were among those represented at a Nov. 12 virtual meeting, public minutes show, including Google, Dstillery, Salesforce, Adobe, Auth0/Okta, Yahoo Inc., Microsoft, DAASI International, WSO2, Ping and Yubico. Their representatives are writing and discussing examples of how they share data now, or may in the future, with the goal of deciding what should be “sanctioned” and what is “unsanctioned” and potentially blocked by browser software.

Included is a prescription from Yahoo Inc. for the presumptive operation of “Federated Single Sign On” that would string together user data from across hundreds of millions of users of sites like TechCrunch, AOL and Yahoo, among others. And a Microsoft “story” explains how a user’s home-based identity provider can control log-ins and log-outs at remote sites.



Twitter unveils ad-free pages and revenue for publishers

REPORT: Lotame’s “Beyond the Cookie” survey says publishers are motivated by data privacy to find alternatives

One of the premier U.S. data-tech companies, Lotame, says it surveyed 200 decision-makers for a “Beyond the Cookie” report as it seeks to promote its own solution for tracking users across sites, called “Panorama ID.”

Two of the report’s findings are particularly noteworthy: 

  • While marketers say their primary reason for adopting new identity solutions is to support audience targeting (52%), for publishers, the central reason is data privacy (59%). 
  • Email-based identity solutions (63%) were the most popular choice when asked what types of ID solutions marketers and publishers were planning to test in the next six months to one year. Contextual (49%) was in second place, followed by cohorts (30%) and probabilistic (29%). 






Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat


Privacy is a concept in disarray, but that’s OK, because lawmakers need to widen its scope and the values it serves

  • Below are excerpts from a University of Chicago Law Review essay by Woodrow Hartzog, a professor of law and computer science at Northeastern University. (See the lead item in today’s Privacy Beat for a summary.)

“Throughout history, privacy has evaded a precise meaning. Initially, lawmakers had no compelling need to give the concept a singular legal definition. The earliest personal information and surveillance rules and frameworks for privacy leveraged specific concepts such as solitude, confidentiality, and substantive due process . . . .

“Daniel Solove, the John Marshall Harlan Research Professor of Law at the George Washington University Law School and perhaps the most prominent and influential privacy scholar of our day, wrote at the turn of the millennium that privacy was ‘a concept in disarray.’

“ . . . In the twentieth century, privacy theorists seemed intent on crafting a definitive, singular meaning for privacy . . . But it turns out that a broad and singular conceptualization of privacy is unhelpful for legal purposes. It guides lawmakers toward vague, overinclusive, and underinclusive rules. It allows industry to appear to serve a limited notion of privacy while leaving people vulnerable when companies and people threaten notions of privacy that fall outside the narrow definition. And it often causes people who discuss privacy in social and political settings to talk past each other because they don’t share the same notion of privacy.

“In an ongoing series of articles and books starting in 2001, Solove worked to reshape the entire narrative around privacy by suggesting that we stop obsessing over what privacy is and start asking what privacy is for? . . . Solove argued that automated systems fueled by personal data don’t just power surveillance tools. These tools power systems that make decisions about people’s personal lives.

“ . . . Most of our modern data privacy rules, however, are built to serve individualistic notions of privacy—that is, to respect a person’s autonomy and dignity. Few are aimed at disrupting power disparities between people and companies, protecting individuals from harassment and manipulation, or seeking a collective wellbeing for a diverse population in which many people, including women, people of color, members of the LGBTQ+ community, and others, are particularly vulnerable to information systems . . . .

“Transparency, consent, and control solutions won’t be enough to get us out of this mess. First, as Solove has noted, the ‘privacy self-management’ approach embodied by notice and choice regimes puts the onus on individuals to protect themselves. But the massive scale and widespread adoption of digital technology have made meaningful informational self-determination impossible. People are simply overwhelmed by the choices presented to them. The result is a threadbare accountability framework that launders risk by foisting it on people who have no practical alternative to clicking the “I Agree” button.

“Second, consent and control are a poor fit for certain information problems, like manipulation and harassment, that have little to do with how information is processed and more to do with how mediated environments put people at risk. Finally, seeking to give people control over their personal information doesn’t account for collective, societal harms from personal information technologies.

“Lawmakers have started to embrace privacy as a concept with multiple overlapping dimensions. Legislators and regulators have begun to target problems such as nonconsensual pornography, microtargeting, manipulative user interfaces, and automated decision-making with innovative rules leveraging secondary liability for dangerous and abusive design choices, substantive limits on data collection and use, relational duties of loyalty and care, equitable relief, and criminal penalties – in addition to implementing outright bans on particular technologies. Judges are also evolving in their thinking about privacy.

“The year is 2021, and privacy is still a concept in disarray. But that’s okay. There is now too much data that is collected by too many different entities and used in too many different ways for any singular definition of privacy to be legally useful anyway. Daniel Solove’s work on understanding privacy has imposed order upon chaos, shifting our focus away from questions about what privacy is and toward the different problems we want our privacy-based rules to address and the specific values we want them to serve.”


Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker. Submit links and ideas for coverage to





Copyright © 2021 Information Trust Exchange Governing Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.

Email Marketing Powered by Mailchimp