The California Privacy Rights Act (“CPRA”) places significant power in the hands of the California Privacy Protection Agency (“CPPA” or “Agency”) to influence the future of privacy regulation in the United States, including—perhaps most importantly—the authority to issue regulations in twenty-two specific, enumerated areas to achieve the broad objective of “further[ing] the purposes of” the CPRA.

As to automated decision-making and profiling, the CPRA grants the Agency the equivalent of a regulatory blank check. The statute's voluminous text references profiling or automated decision-making only twice: first, in defining the term "profiling," and second, in the law's broad rulemaking mandate:

Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.

For this reason, the CPPA has focused a significant amount of its preliminary rulemaking activities on automated decision-making and profiling. This focus began in the fall of 2021, when profiling and automated decision-making were included among the nine topics on which the Agency sought public comment. In late March, the CPPA hosted informational sessions, devoting the majority of an entire day to automated decision-making, including cross-jurisdictional approaches to automated decision-making and profiling under the EU's General Data Protection Regulation (GDPR).

Just last week, the CPPA held stakeholder sessions (Agenda here) over the course of three days, setting aside three hours in the first half of the first day for stakeholders to comment on automated decision-making. Importantly, these comments, provided by a range of stakeholders, offer key insights into some of the more complex, challenging issues that businesses will face when adapting their privacy programs to comply with the new rules and restrictions the CPRA will place on automated decision-making beginning at the start of 2023.

The comments and positions of the individuals who spoke on the topic of automated decision-making varied widely. Several key themes, however, recurred throughout the session, shining a light on concerns shared by various stakeholders, as well as the tug of war between their (and others') competing interests. The stakeholder comments also highlighted the difficulty of regulating automated decision-making technology and profiling in a privacy-protective manner while avoiding overly restrictive regulations that would hamper innovation. Many of the comments fell under the following themes:

  • The Type of Automated Decision-Making Activities That Should Be Regulated: Many speakers highlighted the potentially significant, unintended ramifications of an overly broad scope for the term "automated decision-making technology," which would produce little benefit to consumers while greatly hampering the operations of businesses across all sectors. For that reason, many speakers emphasized the need to limit the reach of automated decision-making regulation to: (1) fully automated decision-making technology; (2) technology that produces legal or similarly significant effects, such as those bearing on a consumer's employment or credit; and/or (3) high-risk activities, sensitive data, and/or automated decision-making that constitutes profiling. In addition, several other speakers noted the need for a requirement that the term encompass only those activities that involve the processing of personal information (which would seem to be inherent in the CPRA regardless).
  • Consumer Rights Relating to the Use of Automated Decision-Making Technology: Speakers also frequently highlighted the need for balance in consumers' access rights regarding automated decision-making technology. On the one hand, as many speakers suggested, the CPRA regulations should not require businesses to disclose information to consumers about low-risk automated decision-making technology, such as spell check or spreadsheets. On the other, the CPPA was cautioned against crafting access rights so broad that businesses would have to provide detailed descriptions of the complex algorithms involved in automated decision-making, as doing so would fail to give the average consumer "meaningful" information about the logic underlying automated processing. Required disclosure of algorithms and similarly sensitive business information would also likely conflict with businesses' right to protect their trade secrets.
  • Consumer Opt-Out Rights Relating to Automated Decision-Making: Many speakers shared the concern that the significant benefits automated decision-making technology offers to consumers and businesses alike could be severely hampered by granting consumers overbroad opt-out rights for activities falling under the definition of automated decision-making. At a minimum, several speakers suggested, regulations relating to automated decision-making should be tethered to the CPRA's statutory access and opt-out rights.
  • Alignment with the GDPR and Other Regulatory Schemes: Many stakeholders, including a representative of the Future of Privacy Forum, urged that the regulations align with GDPR Article 22. Others pointed to the EU's pending Digital Services Act and Artificial Intelligence Act as additional schemes with which the CPRA regulations should be consistent.

Conclusion

Following the May stakeholder sessions, the CPPA will begin the formal rulemaking process, though final regulations are not anticipated until sometime in early 2023. Companies should monitor CPPA rulemaking developments to ensure they are aware of anticipated changes before the law goes into effect at the start of 2023. In addition, companies should immediately begin adapting their privacy programs for compliance not only with the CPRA but also with the Colorado, Connecticut, Virginia, and Utah laws coming online over the course of 2023.

For more information on the stakeholder sessions, including other topics discussed, you can visit the CPPA’s events page here.

Check back often for more of SPB’s and CPW’s thought leadership on the CPRA and the other 2023 state privacy laws, as well as on AI and automated decision-making. For a further discussion of the CPPA’s approach to rulemaking on automated decision-making and profiling, you can view a recording of our recent webinar 2022 Developments and Trends Concerning Biometric Privacy and Artificial Intelligence. In addition, SPB Partners Kyle Fath and Kristin Bryan will take a deeper dive into this and related topics in our June 2 webinar hosted by the International Association of Privacy Professionals (IAPP). Registration for the IAPP webinar is available here (free for IAPP members).

In a new IAPP web conference on Thursday, June 2, 2022 at 11 a.m. ET, data privacy thought leaders Kyle Fath and Kristin Bryan will take a look at key developments and trends in the evolving areas of artificial intelligence (AI) and biometrics. During the session, our subject matter specialists will discuss:

  • AI, biometrics, and privacy compliance – Restrictions on and obligations under forthcoming privacy laws in California, Virginia, Colorado and Utah, including with respect to profiling, automated decision-making and sensitive data.
  • AI and biometrics litigation overview – The current litigation landscape concerning AI and biometrics (including facial recognition).
  • Legislative and regulatory priorities – Pending and anticipated legislative and regulatory developments, both federal and state, as well as global.

The IAPP is the largest and most comprehensive global information privacy community and resource, helping to define, promote, and improve the privacy profession globally.

Click here to register.

Dark patterns are top of mind for regulators on both sides of the Atlantic. In the United States, federal and state regulators are targeting dark patterns as part of both their privacy and traditional consumer protection remits. Meanwhile, the European Data Protection Board (EDPB) is conducting a consultation on proposed Guidelines (Guidelines) for assessing and avoiding dark pattern practices that violate the EU General Data Protection Regulation (GDPR) in the context of social media platforms. In practice, the Guidelines are likely to have broader application to other types of digital platforms as well. Continue Reading “Dark Patterns” Are Focus of Regulatory Scrutiny in the United States and Europe

As part of its continued preliminary rulemaking activities, the California Privacy Protection Agency (“CPPA”) will be holding stakeholder sessions Wednesday, May 4 through Friday, May 6 to provide an opportunity for stakeholders to weigh in on topics relevant to upcoming rulemaking. The Agenda for each of the sessions, which are slated to last an entire day, is available here. Continue Reading California Privacy Regulator to Hold Stakeholder Sessions First Week of May

Connecticut is gearing up to be the next state with a comprehensive privacy law. On April 28, 2022, the Connecticut General Assembly passed SB 6, “An Act Concerning Personal Data Privacy and Online Monitoring,” which is currently with the governor awaiting signature.  Of the state laws that have passed, SB 6 is most similar to the Colorado Privacy Act (“CPA”), Virginia Consumer Data Protection Act (“CDPA”), and Utah Consumer Privacy Act (“UCPA”). For example, under SB 6, the terms “controller,” “processor,” and “personal data” have similar definitions as under the CPA, CDPA, and UCPA. Continue Reading Connecticut General Assembly Passes Comprehensive Privacy Bill

As CPW previously covered, the Fifth Circuit Court of Appeals, in a published decision, affirmed dismissal of Plaintiffs’ Complaint in Allen v. Vertafore, No. 21-20404 (5th Cir. Mar. 11, 2022). In its Opinion, the Fifth Circuit agreed with the district court that Plaintiffs failed to plead a cognizable claim under the federal Driver’s Privacy Protection Act (“DPPA”), 18 U.S.C. § 2721, et seq., refusing to revive a putative class action in which Plaintiffs demanded USD $69.9 billion in liquidated damages.

Allen concerned a data event Vertafore publicly disclosed in November 2020, involving the unsecured online storage of Texas drivers’ license data for over 27.7 million individuals. The first three cases were filed in the District of Colorado, the Northern District of Texas, and the Southern District of Texas, each seeking to represent 27.7 million class members and more than USD $69 billion in statutory liquidated damages under the DPPA, in addition to damages on negligence claims, injunctive relief, and potential punitive damages.
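For context, that headline figure appears to track the statute’s liquidated damages floor: the DPPA provides for liquidated damages of not less than USD $2,500 per violation, so multiplying that minimum across the 27.7 million putative class members yields roughly USD $69.25 billion (27,700,000 × $2,500).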

Consistent with Fifth Circuit precedent, to state a claim for a violation of the DPPA, a complaint must adequately allege that the defendant (1) knowingly obtained, disclosed, or used personal information; (2) from a motor vehicle record; and (3) for a purpose not permitted. On this basis, the first-filed Allen complaint was dismissed: the district court held that Plaintiffs failed to adequately allege that Vertafore knowingly disclosed personal information for a purpose not permitted by the DPPA.

Plaintiffs appealed, and the Fifth Circuit affirmed the district court’s dismissal.

In the wake of this impressive win for Vertafore and the SPB team, Bloomberg Law reached out to CPW’s Rafael Langer-Osuna and Kristin Bryan for a recently published article, seeking their insight on the impact this ruling will have on DPPA litigation going forward.

Kristin Bryan was quoted in the article as saying, “[t]he Driver’s Privacy Protection Act, enacted in 1994, prohibits the disclosure of personal information without consent, with some exceptions. It was passed to safeguard people’s privacy and safety and to regulate the disclosure of personal information by state Departments of Motor Vehicles—not to penalize companies in the wake of a data event, as is the case here. To successfully bring claims under the statute, plaintiffs must allege a knowing disclosure. The Fifth Circuit rightly recognized that a purported mismanagement of information—such as storing driver’s license data on unprotected servers—doesn’t clear that bar.”

In the article, Rafael Langer-Osuna notably stated that “[t]he law has been attractive to plaintiffs because of the potential for high fees. It provides for liquidated damages of at least [USD]$2,500 per violation. Plaintiffs have been making this reach for a long time. Now they’ll be forced to rely on statutes that actually relate to the data breach context.”

For the full scoop, click here to see the news article by Bloomberg Law.

We again want to congratulate the SPB Vertafore team for successfully defeating this high-stakes data privacy case and shaping the course of DPPA litigation to come.

On Tuesday, April 5, CPW’s Alan Friel joined forces with Rebecca Perry, Director of Strategic Partnerships at Exterro, to share their expertise during the “Preparing for 2023 – Tools and Tips to Be Ready for New US Privacy Laws” webinar hosted by Global Data Review.

During this one-hour virtual session, the duo discussed new requirements under the California, Colorado, and Virginia privacy laws, as well as the California attorney general’s evolving enforcement positions. Highlights from the webinar include:

  • New consumer rights and controller obligations
  • New contractual requirements for processors, service providers and contractors
  • How to meet new retention schedule and limitation requirements
  • Managing purpose limitations
  • How the California attorney general views third-party cookies

If you were not able to join the live discussion, a recording of the webinar can be found here.

This week, members of the CPW team, including subject matter experts Kristin Bryan, Kyle Fath, and David Oberly, offered insights into trends across the biometric privacy and artificial intelligence landscape. They also addressed what has transpired in this space in 2022 and what may be on the horizon, including:

  • AI and privacy compliance – An overview of restrictions on and obligations with respect to AI, profiling and other automated decision-making processes under forthcoming privacy laws in California, Virginia, Colorado and Utah.
  • AI and biometrics litigation overview – An overview of the current litigation landscape concerning biometric data and AI, as well as related insights.
  • State legislative priorities – Approaches states are taking to the use of facial recognition technology.
  • Anticipated federal developments – Proposed federal legislation concerning biometrics, AI and other anticipated developments in 2022.

If you were not able to attend the live webinar, you can check out the recording here. We welcome your questions and interest; for any follow-up, do not hesitate to reach out to any member of our team directly. And be sure to check out the CPW team’s 2022 Q1 AI/Biometric Litigation Trends from last week, which is available here.

On March 10, 2022, California Attorney General Rob Bonta (Attorney General) published the first official opinion interpreting the California Consumer Privacy Act (CCPA), concluding that the CCPA’s right to know extends to a business’s internally generated inferences about a consumer, whether drawn from internal or external information sources.

Importantly, the opinion clarifies that inferences made from information that is otherwise exempt from the scope of the CCPA – such as publicly available information – are, in fact, personal information. Finally, the opinion weighs in on the tug of war between consumer privacy rights and businesses’ intellectual property and trade secret rights, definitively stating that trade secrets are completely protected from disclosure under the CCPA. These are important conclusions for businesses to consider in order to ensure CCPA compliance in the immediate term and as they ramp up for the implementation of the California Privacy Rights Act of 2020 (CPRA), which becomes fully operative on January 1, 2023, and substantially amends the CCPA.

Key Takeaways

  • In short, the Attorney General concluded that “internally generated inferences that a business holds about a consumer are personal information within the meaning of the CCPA, and must be disclosed to the consumer on request.” Opinion No. 20-303 (Opinion), p. 15. This is true even if the information on which the inferences are based is exempt from the CCPA when collected (e.g., publicly available information).
  • Arguably, though, businesses do not need to delete internally generated inferences in response to a consumer’s request to delete, even if they are based on personal information collected from the consumer.
  • Trade secrets are completely protected under the CCPA, but a business bears the burden of demonstrating that the withheld information is a trade secret. An inference itself might not be a trade secret, in which case it would have to be disclosed in response to a request to know; the algorithm a company uses to derive its inferences, however, may be a trade secret and, if so, would not have to be disclosed.
  • The CPRA will sharpen this interplay between trade secret and consumer rights: under the new concept of “profiling” and its related consumer rights, businesses will be required to disclose meaningful information about the logic involved in automated decision-making, presenting a potential conflict between consumer rights and trade secret rights that may be addressed in upcoming rulemaking.
  • Colorado, Virginia and Utah’s omnibus privacy laws do not specifically reference inferences, but neither do they enumerate categories of personal data the way the CCPA does, and their definitions of personal data are broad enough to likely capture some types of consumer inferences. Under the Colorado Privacy Act, both the access and deletion rights apply to internally generated inferences to the extent they are personal data. Under the Virginia Consumer Data Protection Act, a consumer can cause deletion but not obtain a copy of personal data not directly collected from the consumer. Utah’s recently passed Consumer Privacy Act allows consumers to delete and obtain a copy of personal data only if the consumer previously provided it directly to the controller, and thus would not allow consumers to obtain a copy or delete internally generated inferences.

Check out the detailed analysis prepared by the team here.