By Bill Densmore | Executive Director
Information Trust Exchange Governing Association
Posted: June 29, 2018 | 12:50 p.m.
The new California Consumer Privacy Act of 2018 (CCPA), signed on Thursday by Gov. Edmund G. Brown Jr., vaults the nation’s largest state into the forefront of regulating how personal data is collected and used.
“If California leads the way, the rest of the nation will follow,” said Elizabeth Galicia, of Common Sense Kids Action, one of the new law’s Sacramento supporters. The law enshrines California’s attorney general as the state’s top personal-data enforcer, in what a Silicon Valley lawmaker called the role of “sheriff in the Wild West” of digital privacy.
And a provision quietly added just days before passage may also begin to bring the value of personal data out of the shadows. That’s because the law, when it takes effect Jan. 1, 2020, will allow tech and data firms to either pay consumers for the use of their data, or charge them for the privilege of remaining anonymous.
Supporters of the idea say it could help pay for content and web services. Opponents say it may mean that some people will be able to afford to buy their privacy, and others will not. And because the concept is to some degree in conflict with other sections of the new law, clarification by the California attorney general, amendments, or eventual litigation are likely.
The result could be the emergence of a digital marketplace for valuing personal data, according to John W. Simpson, the privacy-and-technology project director for Consumer Watchdog, a California advocacy nonprofit that backed the measure. “It could put some kind of interesting data market in place.”
“There are some who have said that if you say ‘no’ [to use of your data] they should still have to give you service,” adds Simpson, who began his career as a journalist. “Probably in the real world that is just not going to happen. This way they cannot just shut you off . . . and consumers would have an understanding of what the company is asserting that their data is worth.”
Legal and privacy experts, including Alan L. Friel, an attorney with BakerHostetler, agree that is the likely effect of the language. Friel’s firm has created a bullet-point analysis of key features of the new law.
Overall, the CCPA gives California residents the right to access, obtain for free and order the deletion of personal information about them — or about their household — collected by larger data and internet companies, the right to be informed what categories of personal information are collected or created about them, and the right to learn whether their data is being sold to “categories” of third parties. Data collection is defined to include passively observing the consumer’s online behavior. The law also requires websites to put a clear, conspicuous link on their homepages entitled “Do Not Sell My Personal Information,” which enables opt-out.
NON-PROFITS NOT COVERED?
A loophole appears to exist for smaller outfits and for public broadcasting, media and other nonprofits that handle consumer personal information. Nonprofit organizations are not subject to the law, because it defines affected businesses as “organized or operated for the profit or financial benefit of its shareholders or other owners.”
To be subject to the law’s privacy obligations, a for-profit company must have revenues in excess of $25 million a year, deal with the personal information (PI) of more than 50,000 consumers, or get 50 percent or more of its annual revenues from selling consumers’ personal information.
The CCPA was whisked into law over four days in Sacramento because industries which trade in consumer data were terrified that a citizen petition with the same name would go on the state’s November ballot. As the clock ticked on a 5 p.m. Thursday deadline, they began compromise negotiations with the ballot measure’s chief proponent, San Francisco real-estate developer Alastair Mactaggart. In the end, Mactaggart backed the bill and withdrew his initiative, and the bill passed the Legislature unanimously. “I feel like it’s the first step, and the country’s going to follow,” Mactaggart said after the bill became law. “Everybody is finally waking up to the importance of digital privacy.”
The result is a landmark law far more favorable to consumers than anything else in the United States so far, and similar in some respects to Europe’s General Data Protection Regulation (GDPR) and ePrivacy rules. In negotiating to kill the initiative and replace it with legislation, many things ended up clearer — and more favorable — for data aggregators.
A week before Thursday’s signing, the compromise language emerged and was superimposed over a shell bill on a related topic that had been tabled a year ago. As late as Sunday night — four days before it became law — the language was reportedly tweaked in weekend negotiations led by a TechNet lobbyist working with lobbyists for Google, Facebook, Amazon, AT&T, and the Interactive Advertising Bureau, among others.
Despite many tweaks favorable to data aggregators, the new law is tougher in at least one respect. While companies can collect data on adults until told not to (“opt-out”), no data can be collected on consumers under 16 until they affirmatively “opt in” to collection, or until their guardian does so for them if they are under age 13.
MARKET FOR PERSONAL DATA?
The most innovative feature of the bill is the way in which it acts to create a marketplace for the valuation of personal information. The new law prohibits service discrimination based upon a consumer’s withdrawal of the right to collect or use personal information about them. But it then goes on to say, at Section 1798.125, that this prohibition does not apply in all cases:
“Nothing in this subdivision prohibits a business from charging a consumer a different price or rate, or from providing a different level or quality of goods or services to the consumer, if that difference is reasonably related to the value provided to the consumer by the consumer’s data. A business may offer financial incentives, including payments to consumers as compensation, for the collection of personal information, the sale of personal information, or the deletion of personal information. A business may also offer a different price, rate, level or quality of goods or services to the consumer if that price or difference is directly related to the value provided to the consumer by the consumer’s data.”
“Right now we’re used to advertiser-supported content,” Friel, of BakerHostetler, said in an interview. “If you are going to severely restrict advertising [by making tracking and personalization a consumer opt-out] you are going to have to pay for it some way. That is a big distinction between the bill and the initiative.”
REASONABLE SUBSCRIPTION CHARGES
The law leaves it up to the California attorney general to write rules for determining the value of user personal information, but it specifies that a subscription charge or a payment to acquire user personal information must be “reasonably related to the value provided to the consumer by the consumer’s data.” It also says such financial incentive practices cannot be “unjust, unreasonable, coercive or usurious in nature.”
The concept of charging consumers who elect privacy — what a bill analyst terms “financial incentive programs” — has been discussed in theory for years and is controversial, in part because the California Constitution gives state residents an “inalienable right to pursue and obtain privacy.”
An official legislative analysis of the measure, Assembly Bill 375, notes:
“These provisions arguably can contribute to the transformation of a constitutional right into a luxury product that is affordable by a select few, creating unequal access to privacy and further enabling predatory and discriminatory behavior. This is a constitutional right that the Legislature should not commodify lightly.”
“I believe this path to pay for privacy is a dangerous and slippery slope,” said California state Sen. Hannah-Beth Jackson.
ADVERTISERS OPPOSED BILL
As of a June 27 legislative hearing, there were five registered supporters of the new law and 31 opponents. The registered supporters are Common Sense Kids Action, CalPIRG, Center for Humane Technology, Consumer Watchdog and Consumer Attorneys of California.
Listed opponents included the Association of National Advertisers, TechNet, the Internet Association, the National Retail Federation, the California Chamber of Commerce, the Media Alliance, and other lobby groups representing insurance, banking, cable operators, grocers, hospitals and restaurants.
The staff report (PDF) on the June 27 hearing said the Media Alliance argued the new law would weaken the protections that would have been in the ballot initiative. Specifically, it takes issue with what it characterizes as “codifying price discrimination for privacy.” It also argues in opposition because the narrowed definition of “sale” eliminates application of the bill to any transfer of data for which valuable consideration has not been provided. Furthermore, the Media Alliance objects to the consumer’s limited right of remediation under a narrowed private right of action.
In testimony hours before the bill became law, the California Chamber of Commerce said it was worried about significant enforcement and court costs and what its lobbyist, Sarah Boot, called “massive liability and other problems with this bill.” But she said the chamber was even more opposed to the initiative petition and would be working to improve the law through amendments before it takes effect in 18 months. She said the law as now in place could:
- Jeopardize operation of loyalty reward programs
- Complicate product-safety recalls
- Inhibit third-party data sharing for fraud protection
- Permit one consumer to receive information about another
WHAT IS ‘PERSONAL INFORMATION’?
While there is ample specific language seeking to define “personal information” (see Section 1798.140 of the law), there are nuances that will likely have to be clarified or litigated. (For a definition and full list of personal-information categories, see the legislative analyst’s notes — at Page 6 — at the bottom of this post.)
- The bill’s comprehensive definition of “personal information” appears fairly similar to that of the initiative petition. A key point in both is that information about an individual obtainable from federal, state or local government databases is by definition not personal information. The definition was tightened in the bill version because tech companies did not want it to include public social-media profiles.
- The bill also defines as personal information inferences drawn from a person’s personal data or online activity used to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
However, a reference to “psychometric information” in the petition was deleted from the bill that just became law. A Wikipedia entry terms psychometrics “a field of study concerned with the theory and technique of psychological measurement. As defined by the National Council on Measurement in Education, psychometrics refers to psychological measurement.”
- Also excluded from the new law’s definition of personal information, (but in the initiative) are “probabilistic identifiers” — described as “the identification of a consumer or a device to a degree of certainty of more probable than not . . . . “
One bit of data that is not listed or defined as personal information: your phone number.
NEW RIGHTS FOR CONSUMERS
Both the bill and the initiative spell out new legal rights for consumers and obligations of companies that collect — or infer — personal information:
- Consumers have the right to demand disclosure of what types of personal data are collected or created, how it is used, and what types of third parties it is being sold to, if any. Consumers can order a company to stop collecting and to delete personal data on them.
- Companies are required to respond within 45 days if they can “verify” that the request comes from a natural person (not a company) about whom they hold personal information. They do not have to give the names of the specific third parties to which data has been sold, and they do not have to respond at all if they cannot “reasonably verify” that the person making the request is a natural person about whom the company holds personal data.
- The bill only gives rights to “consumers” and consumers are defined only as California residents. Likewise it applies only to companies collecting or processing data within the state.
- No disclosure of a personal data “sale” is required if the company that collected the data transfers it elsewhere without material compensation for doing so. In other words, data “sharing” is not covered in the bill as it was in the Mactaggart initiative. The bill also does not require disclosure of data exchanges with nonprofit organizations, or for academic research.
- The law, unlike the initiative, requires that a third-party making a request on behalf of a consumer has to be a “natural person” — it’s not clear if this would make mass inquiries by consumer or privacy groups less feasible.
- Tech may be seeking to make sure that if you want data about you or want selling of your data stopped, you have to be a registered user. Gone from the law is this sentence in the initiative: “A business shall not require a consumer to create an account in order to direct the business not to sell the consumer’s personal information.”
DIFFERENT APPROACH TO OWNERSHIP AND ‘CONTROL’ AND USE
The law appears to largely avoid the term “ownership” when describing personal data; the initiative had assigned ownership of personal data to the user. That is a big conceptual change from the Mactaggart initiative, leaving ambiguous in California and elsewhere the notion of ownership and what one might “own.” Ownership implies control, and the tech companies may not want to be told they are handling data which a consumer “owns.”
Also, the thrust of the initiative was to imply control and provide a means to stop the use of personal data. The initiative read, for example: “You should be able to control the use of your, and your children’s personal information, and be able to stop businesses from selling your information.” The “your data” language – implying ownership by the consumer – is absent from the bill, as are any promises of control by the consumer.
You can see it, you can in some cases require it be deleted, but you aren’t told you can “control” it. Control implies ownership and it is likely tech lawyers don’t want to cede “control” to the consumer.
The bill also appears to steer clear of defining what is meant by “use” of data, focusing instead on disclosure and removal. The point may be to avoid getting into legal arguments over what constitutes “use.” In the bill a business or service provider is given a set of “necessary to do business” exemptions under which they do not have to comply with a consumer’s request to delete personal information. Thus some uses are reserved for the company and not subject to control by the consumer.
CAMBRIDGE ANALYTICA EXEMPTION?
The rules which restrict the use by a “third party” of personal information do not apply to a “service provider” who has a proper contract with the first-party data originator. So a sort of “safe harbor” exists for companies which sell data under contract (with disclosure to the public, however). The sale of data to a defined “third-party” (not under contract) is the only class of data use that a consumer can order “stopped” under the bill.
One aspect of the law would appear to have exempted Facebook in California from any liability for misuse of its users’ personal information which fell into the hands of Cambridge Analytica.
If a first-party data acquirer (think Facebook) “does not have actual knowledge, or reason to believe” that an entity to which it gives user data intends to violate the law, the first-party acquirer is not legally liable for the recipient’s violations. This safe-harbor language was not in the initiative petition.
DAMAGES REDUCED, CONSTRAINED
There is some evidence of efforts in the bill to lower the financial penalties for illegal behavior by data companies. First, the law limits consumers’ private rights of action to cases of mass data breach, not failure to follow rules about the use of personal information.
The civil damages available to a consumer per incident in the bill max out at $750 per person; in the petition it was $3,000. When an injury is claimed by a consumer, a company can avoid liability if the claimed injury is cured within 30 days of being cited by the consumer. However, in exercising enforcement powers, the attorney general can fine a business up to $7,500 for each violation of the law.
The initiative had defined the unauthorized use of personal data as an “injury in fact.” The new law avoids this language, and requires a consumer bringing suit to prove actual damages – a much tougher job in court. The initiative had said “the consumer need not suffer a loss of money or property as a result of the violation in order to bring an action for violation of this Act.”
CALIFORNIA ATTORNEY GENERAL — THE SHERIFF?
Another huge change favorable to tech data aggregators is that the California attorney general basically controls enforcement; citizens can ask a company for privacy relief, but if they want to sue, they have to notify the attorney general’s office and wait 30 days for a response.
The attorney general can then either take on the complaint and pursue it on behalf of the public, or decline to take action, leaving the consumer free to commence litigation. A third option in the bill simply states that the attorney general can order the consumer not to proceed. The official legislative analysis of the bill describes this as problematic and probably unconstitutional – the idea that the AG can simply tell someone they cannot bring suit.
“We need a sheriff in the Wild West, and this is the first step to put some regulations around that,” said California state Sen. Jerry Hill, during a final hearing on the law on June 28 in Sacramento, as he spoke about the new role for the attorney general. Hill’s San Mateo district is in the heart of Silicon Valley.
“Whoever the attorney general is in the state of California is going to be the chief privacy officer for the [country] given the number of technology companies here,” said state Sen. Robert Hertzberg, D-Van Nuys, the new law’s co-sponsor. He said he thought it was “intelligent and cost effective” to put enforcement in the attorney general’s hands.
To emphasize the AG’s authority, the law preempts any state or local jurisdiction in California from making law in the areas covered by the privacy act, closing off for consumers an avenue to seek tougher rules locally.
WHY WAS THE TERM “SHARING” DROPPED?
The term “sharing” appeared in the initiative as something akin to the sale of personal data, but the word is omitted from the new law. The bill defines selling as “selling, renting, releasing, disclosing, disseminating, making available, transferring, or otherwise communicating orally, in writing, or by electronic or other means, a consumer’s personal information by the business to another business or a third party for monetary or other valuable consideration.”
The initiative had a similar sentence about selling, but also included “sharing orally, in writing, or by electronic or other means a consumer’s personal information with a third party, whether for valuable consideration or for no consideration, for the third party’s commercial purposes.” Under the bill, a transfer without material compensation is not a “sale” and is therefore not covered by the disclosure and deletion rules.
Staff analysis of AB 375 summarizes definition of “personal information” and describes policy tradeoffs between tech and privacy advocates.
WHAT IS PERSONAL INFORMATION in the CCPA?
Here is how the California Consumer Privacy Act of 2018 defines personal information, according to an official bill analysis authored by California Assembly staffer Ronald Daylami and posted to the legislative website on June 27 (PDF — see Page 6):
“Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. PI includes, but is not limited to, the following:
- Identifiers such as a real name, alias, postal address, unique personal identifier, online identifier, Internet Protocol address, email address, account name, Social Security number, driver’s license number, passport number, or other similar identifiers.
- Categories of PI described in other existing California law.
- Characteristics of protected classifications under California or federal law.
- Commercial information, as specified.
- Biometric information.
- Internet or other electronic network activity information, including, but not limited to, browsing history, search history, and information regarding a consumer’s interaction with an internet website, application, or advertisement.
- Geolocation data.
- Audio, electronic, visual, thermal, olfactory, or similar information.
- Professional or employment-related information.
- Education information, defined as information that is not publicly available personally identifiable information as defined in the Family Educational Rights and Privacy Act.
- Inferences drawn from any of the information identified in this subdivision to create a profile about a consumer reflecting the consumer’s preferences, characteristics, psychological trends, predispositions, behavior, attitudes, intelligence, abilities, and aptitudes.
“Personal information” excludes publicly available information. For these purposes, “publicly available” would mean information that is lawfully made available from federal, state, or local government records. “Publicly available” would not include biometric information collected by a business about a consumer without the consumer’s knowledge. Further, information would not be “publicly available” if that data is used for a purpose that is not compatible with the purpose for which the data is maintained and made available in the government records or for which it is publicly maintained. Lastly, “publicly available” would exclude consumer information that is de-identified or aggregate consumer information.
WHAT WERE THE “TRADEOFFS” BETWEEN CONSUMER AND PRIVACY ADVOCATES AND THE DATA-TECH INDUSTRIES?
Here is how Assembly staff analyst Ronald Daylami describes the tradeoffs in his written summary of a Committee on Privacy and Consumer Protection hearing on June 27:
- the removal of the initiative’s whistleblower provisions;
- a significant reduction of business’ liability exposure pursuant to consumer-initiated actions;
- a right to cure, when possible, both in the public and private enforcement provisions;
- a limitation of public enforcement to actions by the AG and explicit authorization to receive guidance from the AG on compliance as the single regulatory entity;
- a recognition of the ability of businesses to engage in various research-related activities, such as internal research and development, or other allowable forms of research with specified safeguards that would both ensure informed consent and better protect the consumers’ information used in the research;
- additional express exemptions, such as to exercise or defend legal claims, or for PI collected, processed, sold, or disclosed pursuant to certain federal laws, if the handling of the PI is in conflict with those laws;
- language clarifying that businesses are not required to retain PI in situations where they would not ordinarily maintain that information (which would also undermine consumer protections);
- authorization to engage in certain financial incentive programs, as specified, such as free subscription services in exchange for advertising where the value to the consumer is based on the consumer’s data, as long as the financial incentive program is not unjust, unreasonable, coercive, or usurious and is directly related to the value provided to the consumer by the consumer’s data;
- a narrowing of the definition of “sell” to remove reference to situations that do not involve valuable consideration; and a limiting of businesses’ obligation to reveal to consumers the parties with whom their PI was shared, or to whom it was sold or disclosed for a business purpose, to “categories” of third parties, as opposed to specific third parties.