The California Privacy Rights Act (“CPRA”) places significant power in the hands of the California Privacy Protection Agency (“CPPA” or “Agency”) to influence the future of privacy regulation in the United States, including—perhaps most importantly—the authority to issue regulations in twenty-two specific, enumerated areas to achieve the broad objective of “further[ing] the purposes of” the CPRA.

As to automated decision-making and profiling, the CPRA has granted the Agency the equivalent of a regulatory blank check. The CPRA references profiling or automated decision-making only twice in the statute’s voluminous text: first, in defining the term “profiling,” and second, in the law’s broad rulemaking mandate:

Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.

For this reason, the CPPA has focused a significant amount of its preliminary rulemaking activities on automated decision-making and profiling. This focus began in the fall of 2021, when profiling and automated decision-making were among the nine topics on which the Agency sought public comment. In late March 2022, the CPPA hosted informational sessions, devoting the better part of a full day to automated decision-making, including cross-jurisdictional approaches to automated decision-making and profiling under the EU’s General Data Protection Regulation (GDPR).

Just last week, the CPPA held stakeholder sessions (Agenda here) over the course of three days, setting aside three hours in the first half of the first day for stakeholders to comment on automated decision-making. Importantly, these comments—provided by a range of stakeholders—offer key insights into some of the more complex, challenging issues that businesses will face when adapting their privacy programs to comply with the new rules and restrictions that will apply to automated decision-making under the CPRA beginning at the start of 2023.

The comments and positions of the individuals who spoke on the topic of automated decision-making varied widely. However, several common themes reiterated throughout the session shine a light on concerns shared by various stakeholders, as well as the tug of war between their (and others’) competing interests. The stakeholder comments also highlighted the difficulty of regulating automated decision-making technology and profiling in a privacy-protective manner while avoiding overly restrictive regulations that would hamper innovation. Many of the comments fell under the following themes:

  • The Type of Automated Decision-Making Activities That Should Be Regulated: Many speakers highlighted the potentially significant, unintended ramifications of an overly broad scope for the term “automated decision-making technology,” which would produce little benefit to consumers while greatly hampering the operations of businesses across all sectors. For that reason, many speakers emphasized the need to limit the reach of automated decision-making regulation to: (1) fully automated decision-making technology; (2) technology that produces legal or similarly significant effects, such as those bearing on a consumer’s employment or credit; and/or (3) high-risk activities, sensitive data, and/or automated decision-making that constitutes profiling. In addition, several other speakers noted the need for a requirement that the term encompass only those activities that involve the processing of personal information (which would seem to be inherent in the CPRA regardless).
  • Consumer Rights Relating to the Use of Automated Decision-Making Technology: Speakers also frequently highlighted the need for balance as it relates to consumers’ access rights regarding automated decision-making technology. On the one hand, as many speakers suggested, the CPRA should not impose requirements on businesses to disclose information to consumers about low-risk automated decision-making technology, such as spell check or spreadsheets. On the other hand, the CPPA was cautioned to avoid crafting regulations affording access rights that would require businesses to provide detailed descriptions of the complex algorithms involved in automated decision-making, as doing so would fail to provide average consumers with “meaningful” information about the logic underlying automated processing. At the same time, the required disclosure of algorithms and similar sensitive business information would likely conflict with the right of businesses to protect their trade secrets and similar types of information.
  • Consumer Opt-Out Rights Relating to Automated Decision-Making: Many speakers shared the concern that the significant benefits automated decision-making technology offers consumers and businesses alike could be severely hampered by granting consumers overbroad opt-out rights over activities that fall under the definition of automated decision-making. At a minimum, several speakers suggested, regulations relating to automated decision-making should be tethered to the CPRA’s statutory rights of access and opt-out.
  • Alignment with the GDPR and Other Regulatory Schemes: Many stakeholders, including a representative of the Future of Privacy Forum, urged that the regulations align with GDPR Article 22. Others pointed to the EU’s pending Digital Services Act and Artificial Intelligence Act as other schemes with which the CPRA’s regulations should be consistent.

Conclusion

Following the May stakeholder sessions, the CPPA will begin the formal rulemaking process, but final regulations are not anticipated until sometime in early 2023. Companies should monitor developments in CPPA rulemaking to ensure they are aware of any anticipated changes to the law, which will go into effect at the start of 2023. In addition, companies should immediately begin adapting their privacy programs for compliance not only with the CPRA but also with the Colorado, Connecticut, Virginia, and Utah laws that will come online over the course of 2023.

For more information on the stakeholder sessions, including other topics discussed, you can visit the CPPA’s events page here.

Check back often for more of SPB’s and CPW’s thought leadership on the CPRA and the other 2023 state privacy laws, as well as on AI and automated decision-making. For a further discussion of the CPPA’s approach to rulemaking on automated decision-making and profiling, you can view a recording of our recent webinar 2022 Developments and Trends Concerning Biometric Privacy and Artificial Intelligence. In addition, SPB Partners Kyle Fath and Kristin Bryan will take a deeper dive into this and related topics in our June 2 webinar hosted by the International Association of Privacy Professionals (IAPP). Registration for the IAPP webinar is available here (free for IAPP members).

Dark patterns are top of mind for regulators on both sides of the Atlantic. In the United States, federal and state regulators are targeting dark patterns as part of both their privacy and traditional consumer protection remits. Meanwhile, the European Data Protection Board (EDPB) is conducting a consultation on proposed Guidelines (Guidelines) for assessing and avoiding dark pattern practices that violate the EU General Data Protection Regulation (GDPR) in the context of social media platforms. In practice, the Guidelines are likely to have broader application to other types of digital platforms as well. Continue Reading “Dark Patterns” Are Focus of Regulatory Scrutiny in the United States and Europe

As part of its continued preliminary rulemaking activities, the California Privacy Protection Agency (“CPPA”) will be holding stakeholder sessions Wednesday, May 4 through Friday, May 6 to provide an opportunity for stakeholders to weigh in on topics relevant to upcoming rulemaking. The Agenda for each of the sessions, which are slated to last an entire day, is available here. Continue Reading California Privacy Regulator to Hold Stakeholder Sessions First Week of May

Connecticut is gearing up to be the next state with a comprehensive privacy law. On April 28, 2022, the Connecticut General Assembly passed SB 6, “An Act Concerning Personal Data Privacy and Online Monitoring,” which is currently with the governor awaiting signature.  Of the state laws that have passed, SB 6 is most similar to the Colorado Privacy Act (“CPA”), Virginia Consumer Data Protection Act (“CDPA”), and Utah Consumer Privacy Act (“UCPA”). For example, under SB 6, the terms “controller,” “processor,” and “personal data” have similar definitions as under the CPA, CDPA, and UCPA. Continue Reading Connecticut General Assembly Passes Comprehensive Privacy Bill

On March 10, 2022, California Attorney General Rob Bonta (Attorney General) published the first official opinion interpreting the California Consumer Privacy Act (CCPA), concluding that the CCPA’s right to know extends to a business’s internally generated inferences about a consumer, whether drawn from internal or external information sources.

Importantly, the opinion clarifies that inferences made from information that is otherwise exempt from the scope of the CCPA – such as publicly available information – are, in fact, personal information. Finally, the opinion weighs in on the tug of war between consumer privacy rights and businesses’ intellectual property and trade secret rights, definitively stating that trade secrets are completely protected from disclosure under the CCPA. These are important conclusions for businesses to consider in order to ensure CCPA compliance in the immediate term and as they ramp up for the implementation of the California Privacy Rights Act of 2020 (CPRA), which becomes fully operative on January 1, 2023, and substantially amends the CCPA.

Key Takeaways

  • In short, the Attorney General concluded that “internally generated inferences that a business holds about a consumer are personal information within the meaning of the CCPA, and must be disclosed to the consumer on request.” Opinion No. 20-303 (Opinion), p. 15. This is true even if the information on which the inferences are based is exempt from the CCPA when collected (e.g., publicly available information).
  • Arguably, though, businesses do not need to delete internally generated inferences in response to a consumer’s request to delete, even if based on personal information collected from the consumer.
  • Trade secrets are completely protected under the CCPA, but a business bears the burden of demonstrating that the withheld information is a trade secret. The inference itself might not be a trade secret and would have to be disclosed in response to a request to know; however, the algorithm that a company uses to derive its inferences may be a trade secret and, if so, would not have to be disclosed.
  • The CPRA will sharpen this interplay between trade secret and consumer rights: businesses will be required to disclose meaningful information about the logic involved in automated decision-making under the new concept of “profiling” and its related consumer rights, presenting a potential conflict between consumer rights and trade secret rights that may be addressed in upcoming rulemaking.
  • The Colorado, Virginia, and Utah omnibus privacy laws do not specifically reference inferences, but neither do they enumerate categories of personal data the way the CCPA does, and their definitions of personal data are broad enough to likely capture some types of consumer inferences. Under the Colorado Privacy Act, both the access and deletion rights apply to internally generated inferences to the extent they are personal data. Under the Virginia Consumer Data Protection Act, a consumer can cause deletion of, but not obtain a copy of, personal data not directly collected from the consumer. Utah’s recently passed Consumer Privacy Act allows consumers to delete and obtain a copy of personal data only if the consumer previously provided it directly to the controller, and thus would not allow consumers to obtain a copy of, or delete, internally generated inferences.
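
To make this state-by-state divergence easier to scan, the minimal Python sketch below encodes the analysis above as a simple lookup table. It is an illustrative simplification only (statutory nuances, such as the arguable status of deleting inferences under the CCPA, do not reduce cleanly to booleans) and is not legal advice.

```python
# Simplified, illustrative summary of the analysis above; not legal advice.
# None marks a point the Attorney General's opinion leaves arguable rather
# than settled.
inference_rights = {
    "CCPA (California)": {"access": True,  "delete": None},   # deletion of inferences arguably not required
    "CPA (Colorado)":    {"access": True,  "delete": True},   # to the extent inferences are personal data
    "CDPA (Virginia)":   {"access": False, "delete": True},   # no copy of data not collected directly
    "UCPA (Utah)":       {"access": False, "delete": False},  # rights limited to data the consumer provided
}

for law, rights in inference_rights.items():
    print(f"{law}: access={rights['access']}, delete={rights['delete']}")
```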

Check out the detailed analysis prepared by the team here.

The Utah Consumer Privacy Act (“UCPA”) was signed into law by Governor Spencer J. Cox yesterday. CPW has been tracking the UCPA’s progress throughout this legislative session.

Effective Date

December 31, 2023.

Applicability

In comparison to other state laws, the UCPA’s applicability thresholds are more stringent, requiring controllers or processors to meet all three of the following prongs:

  1. Do business in the state or target residents with products/services;
  2. Have annual revenue of $25 million or more; and
  3. Meet certain data collection, processing, or sale/revenue thresholds.

Practically, and unlike the other state laws, this will likely exempt smaller to mid-market organizations that have limited revenue but substantial data collection, processing, and/or sale activities.

In comparison, under the CCPA/CPRA, covered businesses need only meet the revenue threshold or another threshold (e.g., sell/share the personal information of 50,000 or more consumers (raised to 100,000 by the CPRA), or derive 50% or more of annual revenues from selling consumers’ personal information).  The CDPA and CPA do not have revenue thresholds at all.
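
To make the conjunctive-versus-disjunctive distinction concrete, here is a minimal Python sketch of the two applicability tests, paraphrasing the statutory prongs. The field names and the example entity are hypothetical, the thresholds are simplified, and the sketch is not a compliance tool.

```python
# Minimal sketch, paraphrasing the statutory applicability tests; not a
# compliance tool. Field names and the example entity are hypothetical.
from dataclasses import dataclass

@dataclass
class Entity:
    in_scope_activity: bool        # does business in-state / targets residents
    annual_revenue_usd: float
    consumers_processed: int       # consumers whose personal data is handled
    sale_revenue_share: float      # share of gross revenue from selling personal data

def ucpa_applies(e: Entity) -> bool:
    """UCPA: conjunctive -- ALL three prongs must be met."""
    data_prong = (
        e.consumers_processed >= 100_000
        or (e.sale_revenue_share > 0.5 and e.consumers_processed >= 25_000)
    )
    return e.in_scope_activity and e.annual_revenue_usd >= 25_000_000 and data_prong

def ccpa_applies(e: Entity) -> bool:
    """CCPA/CPRA: disjunctive -- meeting ANY one threshold suffices."""
    return e.in_scope_activity and (
        e.annual_revenue_usd > 25_000_000
        or e.consumers_processed >= 100_000   # 50,000 under the pre-CPRA CCPA
        or e.sale_revenue_share >= 0.5
    )

# A high-volume, low-revenue data broker: covered by the CCPA/CPRA,
# but exempt under the UCPA because it fails the $25M revenue prong.
broker = Entity(True, 10_000_000, 500_000, 0.9)
print(ucpa_applies(broker), ccpa_applies(broker))  # False True
```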

Enforcement

The UCPA tasks the Department of Commerce’s Division of Consumer Protection (“Division”) with receiving and investigating consumer complaints alleging violations of the UCPA.  Depending on the outcome of its investigation, the Division may refer certain cases to the Utah Attorney General (“AG”), who has exclusive authority to enforce the UCPA.  Based on such a referral, the AG may initiate an enforcement action against a controller or processor that violates the UCPA.

Enforcement Risk

Controllers or processors receiving a notice of violation have a 30-day cure period.  After that period, the AG may initiate an action against a controller or processor that fails to cure the noticed violations or whose violations are ongoing.  The AG may seek up to $7,500 for each violation.

Rulemaking

The UCPA does not provide explicit authority for the AG to issue regulations. Interestingly, it requires the AG and the Division to compile a report by July 1, 2025 that evaluates the law’s liability and enforcement provisions and summarizes the data protected (and not protected) by the UCPA. Perhaps this report will spur amendments and regulations, though it remains to be seen whether the legislature will act to empower the AG, Division, or another agency to carry out rulemaking in the meantime.


Advancements in artificial intelligence (AI) have led to a wide range of innovations across our society and economy, including in industry verticals such as healthcare, transportation, and cybersecurity. Recognizing that AI also carries limitations and risks that must be addressed, regulators and legislators worldwide have turned their attention to the technology.

In 2020, Congress directed the National Institute of Standards and Technology (NIST) to develop an AI Risk Management Framework in collaboration with the public and private sectors.  Last week, pursuant to that mandate, and following the initial requests for information and workshops on AI it held in 2021, NIST released two documents relating to its broader efforts on AI. First, it published an initial draft of the AI Risk Management Framework on March 17; public comments on the framework are open through April 29, and the agency is holding a public workshop March 29-31.  Second, it updated a special publication, Towards a Standard for Identifying and Managing Bias in Artificial Intelligence. While it is unclear whether NIST’s efforts will lead to a broader consensus or federal legislation on AI, the Federal Trade Commission (FTC) and state legislatures are already focused on AI in the immediate term.

As we have previously reported on CPW (here), the FTC is focused on AI and has indicated that it is considering promulgating AI-related regulations.  Though statements by Commissioner Wilson seem to have cast doubt on the likelihood of the Commission issuing AI-focused regulations in the first half of this year, its recent settlement in the Weight Watchers case reinforces the agency’s commitment to consumer privacy and to addressing the effects that AI has on it.

AI and State Privacy Laws

AI is a focus at the state level as well. Starting in 2023, AI, profiling, and other forms of automated decision-making will be regulated under the broad and sweeping privacy laws in California, Virginia, and Colorado, which provide corresponding rights for consumers to opt out of certain processing of their personal information by AI and similar processes.  We can expect to see AI and profiling concepts fleshed out substantially in regulations promulgated pursuant to the California Privacy Rights Act (CPRA). As of now, the CPRA is very light on details regarding profiling and AI, but it will seemingly require businesses’ responses to consumer requests to know/access “to include meaningful information about the logic involved in such decision-making processes” – in other words, information about the algorithms used in AI and automated decision-making. We can expect regulations issued pursuant to the Colorado Privacy Act as well (in Virginia, the picture is less clear, as the Attorney General was not given rulemaking authority). Organizations should understand the requirements as to AI, profiling, and automated decision-making in these quickly approaching privacy regimes, and continue to pay attention as rulemaking in California and Colorado progresses.


In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation.  Please reach out to the authors if you are interested in additional information.

SEC Set to Consider Cybersecurity Proposal to Amend Regulations, Likely Affecting Public Companies | Consumer Privacy World

Privacy Continues to be Top of Mind Issue With President Biden’s State of the Union Address and Movement on FTC Nominee Today | Consumer Privacy World

UPDATED: Utah One Step Closer to a Consumer Privacy Bill | Consumer Privacy World

CPW on the Speaking Circuit in March: Warren to Speak at PrivSec China on China’s Data Privacy Law | Consumer Privacy World

Maryland Considering Biometrics Bill That Could Shift Compliance Landscape and Contains Private Right of Action | Consumer Privacy World

Georgia Considering Broad Privacy Bill With Private Right of Action and Liquidated Statutory Damages That Would Exceed Scope of California Law | Consumer Privacy World

CPW on the Speaking Circuit in March: Golding to Speak at 31st National HIPAA Summit | Consumer Privacy World

Utah One Step Closer to a Consumer Privacy Bill | Consumer Privacy World

Squire Patton Boggs (US) LLP and CPW Welcomes Privacy Pro David Oberly | Consumer Privacy World

ICO, CMA and Google Reach Agreement on Privacy Sandbox Proposals | Consumer Privacy World

The Metaverse Social and Economic Implications: A Do-Not-Miss CTO Circle Event | Consumer Privacy World

Federal Judge Refuses Second Time to Approve Class Action Settlement, Rejecting Plaintiffs’ “You Can Lead a Horse To Water” Explanation Upon Identifying Notice Deficiencies | Consumer Privacy World

Squire Patton Boggs Continues Growth of Acclaimed Data Privacy, Cybersecurity & Digital Assets Practice With Promotion of Kyle Fath and Litigator Kristin Bryan to Partner | Consumer Privacy World

President Biden to Nominate DC Circuit Judge Ketanji Brown Jackson to Supreme Court-What Impact Will This Have on Data Privacy and Cybersecurity Cases Going Forward? | Consumer Privacy World

Illinois Appellate Panel Ruling Finds Union Workers’ Biometric Claims Preempted by Labor Law and Subject to Binding Arbitration | Consumer Privacy World

Federal Court Dismisses California Cybersecurity Litigation Concerning Alleged Disclosure of Information in Website Hack | Consumer Privacy World

Early FTC Action in 2022 on Data Privacy, Facial Recognition and AI Less Likely Following Commissioner Remarks to U.S. Chamber of Commerce | Consumer Privacy World

Loyalty Program CCPA Compliance: Kyle Dull Talks to Law360 | Consumer Privacy World

Federal Court Gives Rare Refusal for Final Sign Off on Data Privacy Class Action Settlement, Faulting Low Take Rate and Excessive Fees | Consumer Privacy World

CCPA/CPRA Proposed Amendments Would Extend HR and B2B Data Exemptions, or Would They? | Consumer Privacy World

EDPB Coordinated Enforcement Action on Cloud under the CEF and the French CNIL’s 2022 Investigation Program | Consumer Privacy World

CPW is pleased to announce that today David Oberly joins Squire Patton Boggs (US) LLP’s globally recognized Data Privacy, Cybersecurity & Digital Assets Practice from Blank Rome, where he played an instrumental role in launching the firm’s Biometric Privacy Practice.  As a recognized thought leader in the biometric privacy space, David serves as a go-to expert for companies that utilize biometrics in their operations—counseling clients on the full range of regulatory compliance obligations applicable today, as well as on managing potential legal exposure and liability risks. David also regularly develops organization-wide biometric privacy compliance programs in connection with all types of biometric technologies.

David also serves as a trusted privacy advisor to companies across a wide variety of industries, providing compliance, risk management, and product guidance on a broad assortment of privacy, security, and data protection issues that companies face in today’s highly digital world. He has particular expertise and experience in counseling and advising on, and developing compliance programs in connection with, consumer privacy laws, including the CCPA, CPRA, CDPA, and CPA. In this capacity, David routinely assists clients in understanding how consumer privacy laws impact their organizational data handling and security practices and has helped numerous companies operationalize compliance with today’s growing web of consumer privacy regulation. David also regularly provides guidance on compliance with a wide range of other state and federal privacy laws, including the New York SHIELD Act, NYDFS Part 500 Cybersecurity Regulation, Florida Security of Communications Act (FSCA), GLBA, HIPAA, and FCRA, among others.

David has deep experience in security incident response matters—both in terms of assisting clients in incident response and crisis management following data breach events and in counseling clients on concerns regarding potential security incidents. David’s expertise extends to a wide range of security incidents, including cloud data breaches, malware credit card breaches, employee phishing breaches, social media account takeover events, ransomware, and inadvertent data disclosure events. David is also experienced in handling all aspects of the incident response process, including post-incident forensic and regulatory investigations, notifications to impacted individuals and privacy regulators, interacting with law enforcement and regulators, and implementing post-incident remediation plans.  David’s advisory work is informed by his significant experience in defending and litigating high-stakes, high-exposure biometric privacy class actions, particularly those brought under the Illinois Biometric Information Privacy Act (BIPA), as well as deep experience in defending other types of privacy and consumer protection class litigation.

Welcome, David!