Can multistakeholder governance help avoid platform “capture” of the web? | If IAB Tech Lab won’t control UID2, who will?

Privacy Beat

Your weekly privacy news update.


 

(ABOVE: Screen capture, March 24, 2022)

Multistakeholder governance to stop web “capture” by dominant intermediaries? Clues in Berjon’s personal post

Technologies governed by multi-stakeholder organizations that prevent dominance of a single business intermediary are needed if a “capture” of the open web by platforms like Google and Facebook is to be avoided, suggests Robin Berjon, a respected Internet standards expert and data governance vice president at The New York Times.

Writing in a personal blog post entitled “Capture Resistance” — which he cautions should be regarded as personal opinion and not Times policy — Berjon says the commonly expressed view that “centralization” of the web is inherently bad needs more nuance. What’s important, he says, is that the sharing or processing of personal data be done under rules and with technology that support personal privacy and avoid web “capture” by one or two big companies. “Infrastructure neutrality” is needed, he argues.

“Information is power and a company developing the capability to observe the behavior of users on another business’s products is a capture vector,” writes Berjon.

He writes that culprits in the extraction of personal information might include web-browser software and operating systems, among others. “A massive and persistent capture dynamic has centralized gatekeeping power in the hands of two companies (one push, one pull),” he writes, in an apparent reference to Alphabet-Google and Meta-Facebook.

“Privacy is first and foremost about people, but it also needs to protect businesses against the misappropriation of trade secrets,” writes Berjon. “We need to grow our common understanding of privacy beyond a narrow understanding of personal data and extend it to the governance of data flows that impact people.”

Privacy and other “nutrition labels” should be part of “standard choice dialogs” offered to consumers by web-browser and search services, writes Berjon, and “for historical reasons, this will also require better financing models for browsers, a hard conversation that is long overdue and must happen if we want a diverse and pluralistic web instead of one captured by a self-reinforcing browser/search dynamic.”

Currently the Chrome browser, controlled by Google, dominates web access, followed by Apple’s Safari browser. The Microsoft Edge, Mozilla Firefox and Brave browsers have much smaller user bases. All are controlled by for-profit companies except Firefox, which is developed by Mozilla, a subsidiary of the nonprofit Mozilla Foundation.

PLATFORMS AND PRIVACY

WASHINGTON WATCH

FTC AND PRIVACY 

ITEGA’s mission: Trust, identity, privacy and information commerce.

ITEGA calls for support of ‘public option’ user privacy/identity ecosystem — led by journalism-aiding nonprofit

Learn More

 

ABOVE: Nonprofit Consumer Reports offers data-privacy/security services to the public

More confusion for publishers, advertisers after IAB Tech Lab declines to be UID2 identity/privacy enforcer

Confusion keeps growing for publishers, advertisers and others who seek information about who is visiting their web pages — and whether it’s possible to balance tracking with privacy.

  • The latest challenge was announced last week, when Google said it will stop logging IP addresses within Google Analytics, a widely used cloud service that provides data on website traffic. The changes will take effect in 2023 and appear to be an effort by the platform to address privacy concerns raised by European Union regulators and activists. The decision will potentially make it harder for Google Analytics customers to “track” user behavior.
  • That follows a decision announced Feb. 28 by the Interactive Advertising Bureau (IAB) Tech Lab not to get involved in policing the use of email-based identifiers that are shared among ad-tech companies, advertisers and publishers. The IAB Tech Lab board’s decision means there is no legal or industry overseer of privacy when it comes to sharing user-identity attributes, leaving Google, and to some extent Facebook, free to do as they wish.
  • Gannett Corp. voluntarily disclosed that it had billed millions of dollars of digital advertising that actually appeared on its regional newspaper sites, rather than on USAToday.com, where advertisers were led to believe it would run.

“Gannett’s mess-up exposed the elephant in the room,” wrote ad-tech analyst Augustine Fou in a LinkedIn analysis of the disclosure entitled “Time to Acknowledge Ad Fraud, Instead of Countless Oopsies,” in which he argues there is no independent regulation or verification of digital ad placements. He said the disclosure suggests “1) that no one was looking, 2) the fraud detection tech doesn’t work, 3) that fraud detection worked but everyone ignored it anyway . . . Yeah, we’re looking at you — the entire ad tech ecosystem.”

IAB Tech Lab’s decision — “Tech Lab has decided it will not be taking on the technical administrator role for UID 2.0 at this point” — was disclosed by CEO Anthony Katsur deep in a blog post and picked up and covered by AdExchanger’s Allison Schiff. The problem for the Tech Lab is fear that administration would entail having to “disconnect” ad-tech companies violating privacy-use restrictions on the email identifiers, provoking lawsuits or upsetting corporate members of the nonprofit Tech Lab. Instead, it appears focused on helping create protocols and standards, without an enforcement role.

RELATED LINKS: 

AD TECH AND PRIVACY

PRIVACY AND IDENTITY

PERSONAL PRIVACY 

FACIAL RECOGNITION

 

In Electronic Frontier Foundation interview, Edward Snowden highlights lack of bitcoin privacy

Edward Snowden, living in Russia after his 2013 disclosure of massive U.S. government surveillance of phone and internet data awakened public concern about privacy and surveillance, has challenged the notion that the digital currency bitcoin is private, despite some of its early hype.

“Bitcoin is not an anonymous ledger, it is truly a public ledger, and those things are always out there,” Snowden says in a video interview with Electronic Frontier Foundation (EFF) attorney Marta Belcher, posted last week. He tells Belcher that even though the digital token is theoretically “anonymous” as a method of exchange, the recording of every transaction in a permanent public ledger, replicated across many computers, means that insiders and governments with enough computing capacity can trace transactions.

Reporter Shawn Amick, writing about the EFF interview at Bitcoin Magazine’s online site, says companies such as Elliptic have an entire business model built on tracking bitcoin and other cryptocurrency transactions. Snowden says those firms seek business advantage: “You get chain analysis people and whatnot who are doing fairly devious things with it.”

PERSONAL PRIVACY 

PRIVACY BUSINESS

MEDIA AND PLATFORMS

CALIFORNIA PRIVACY 

Like what you see? Then recommend to a friend.

Subscribe to Privacy Beat

Utah Consumer Privacy Act gubernatorial fate not yet announced as of March 24

Utah Gov. Spencer J. Cox had until March 24 to sign into law Senate Bill 227, the Utah Consumer Privacy Act. A news release from the governor’s office, dated and posted March 23, did not list the bill among those he had signed. (TEXT OF UTAH ACT)

STATEHOUSE BEAT 

EU & UK PRIVACY 

US-EU PRIVACY SHIELD TALKS

GLOBAL PRIVACY

UPCOMING EVENTS

QUOTE OF THE WEEK 

Berjon: Governance and neutrality — even if “centralized” — can help eliminate web concentration of power

“A decentralized system, in the sense of not having a topologically central point of control, is neither necessary nor sufficient to eliminate the excessive concentration of power that a party may hold over others in the system. Hypertext is decentralized but the Web can still be captured by controlling discovery or browsing. Token-based governance may share power but is susceptible to plutocracy. Conversely, a centralized system operating under institutional rules of democratic access can (imperfectly) resist the excessive accrual of power — see for instance Wikipedia.

“A recent and excellent IETF draft from Mark Nottingham defines centralization as “the ability of a single entity (e.g., a person, company, or government) — or a small group of them — to exclusively observe, capture, control, or extract rent from the operation or use of an Internet function . . . . ”

“ . . . Centralisation results in another entity being able to overpower your ability to operate your system according to your intentions. It develops over time through various types of capture that include, as listed in the IETF draft, observing your operations (breaching confidentiality, appropriating trade secrets such as the behavior of people on your property), controlling your system (forcing you into choices that are inferior from business and technical perspectives), or extracting rent which, by any meaningful measure of its impact, is often indistinguishable from extracting ransom . . . .

” . . . A common component of capture attacks is to statistically rig the game so as to generate capture effects involving large populations even though, technically, any one of those people could have resisted the statistical nudging. A good example of this is using defaults to move laterally and project power from one space into another rather than compete on the merits. (Defaulting the search engine, browser, or mapping provider are all typical examples of lateral capture.) . . . when capture succeeds (even partially), the operational performance of the technological system being captured is degraded because its owners lose control over it, lose revenue from it, and fewer people have access to better products or better technology. An architecture that is susceptible to capture has a vulnerability; and we can explore capture resistance through threat modeling and develop best practices for mitigation . . . First, information is power and a company developing the capability to observe the behavior of users on another business’s products is a capture vector. More generally, extracting data across contexts is a major contributor to anti-competitive advantage.

“Privacy is first and foremost about people, but it also needs to protect businesses against the misappropriation of trade secrets. We need to grow our common understanding of privacy beyond a narrow understanding of personal data and extend it to the governance of data flows that impact people, including of actions that take place at a remove from personal data, for instance after anonymisation or through on-device methods. Preventing data from leaking, strictly forbidding user agents such as OSs and browsers from collecting data beyond basic telemetry and crashes, and enforcing purpose limitations whenever data has to be shared are all important mitigations . . .  We must prevent bad actors from using defaults to project power laterally between markets. The Web community should develop standard choice dialogs for browsers and search, including nutrition labels to help guide people make informed decisions over aspects of the options they cannot readily assess for themselves (eg. for privacy). For historical reasons this will also require better financing models for browsers, a hard conversation that is long overdue and must happen if we want a diverse and pluralistic Web instead of one captured by a self-reinforcing browser/search dynamic . . . .

” . . . Repairing the damage this has caused will require developing a sounder understanding of gatekeeping power, ridding the tech community of the dangerous delusion that neutral relevance exists, and developing sustainable interoperability in social, search, and aggregation . . . . .

” . . . We also need to do work to prevent opaque mechanisms from being used in place of open markets. Much has been said about the US State Attorneys General’s complaint that alleges that Google ran third-price auctions which they claimed were second-price and pooled the difference to inflate other bids to help make its offering look better than competitors’ and reap the benefits from winning more bids. Whether these allegations are true is for the courts to decide, but the fundamental problem is that the allegations should not be possible. It should be obvious to all involved in a market and easily proven, especially in a market that is a key infrastructural component of the Internet, how auctions are operating and who wins what under which terms . . . .

” . . . Replacing a market with opaque mechanism design has two consequences. First, it captures the market since the mechanism will be optimized to benefit its designer rather than as a public good. Second, it captures the participants in the transactions because, in the absence of the information that a market normally provides, they lose the ability to take rational initiative and have to put their trust in charlatans — a situation that will sound familiar to those who know digital advertising . . . . 

” . . . The more general issue of which replacing markets with designed mechanisms is an instance is that of infrastructure neutrality. When infrastructure ceases to be neutral, which is to say when the infrastructure system uses data about its users in order to manage their behavior (an approach often labeled “smart”), it becomes decreasingly possible for them to act freely, rationally, and to take risks or innovate. Without infrastructure neutrality, the companies that operate on it lose the ability to learn about their environment, to adapt to it, or to develop a vision for their future because they are being too shaped by the infrastructure . . . . A key requirement here is that any infrastructural layer must be subject to either exit or voice (or both), which is to say that either it must be easy and practical for participants to leave one infrastructure provider for another (thanks to portability and interoperability arrangements) or they must be a stakeholder with genuine influence (at the very least a modicum of control such as voting in their constituency) over the governance of the system. This applies to app stores just as much as it does to advertising infrastructure . . . .

” . . . The danger of intermediation in particular cannot be stressed enough. Given the fast-growing complexity of our digital lives, we are facing with high probability and inside of a few years the dystopia of a single company inserting itself as the broker for all of our transactions with third parties. As a community, we need to become systematic and unforgiving in hardening our architectures against capture and we need to do it proactively rather than reactively. This requires an in-depth understanding of attack vectors and their mitigation.”

ABOUT PRIVACY BEAT

Privacy Beat is a weekly email update from the Information Trust Exchange Governing Association in service to its mission. Links and brief reports are compiled, summarized or analyzed by Bill Densmore and Eva Tucker.  Submit links and ideas for coverage to newsletter@itega.org.


Copyright © 2022 Information Trust Exchange Governing Association, All rights reserved.

Want to change how you receive these emails?
You can update your preferences or unsubscribe from this list.
