The California Privacy Rights Act (“CPRA”) places significant power in the hands of the California Privacy Protection Agency (“CPPA” or “Agency”) to influence the future of privacy regulation in the United States, including—perhaps most importantly—the authority to issue regulations in twenty-two specific, enumerated areas to achieve the broad objective of “further[ing] the purposes of” the CPRA.

As to automated decision-making and profiling, the CPRA has granted the Agency the equivalent of a regulatory blank check. In this regard, the CPRA references profiling or automated decision-making a total of two times throughout the voluminous text of the statute: first, in defining the term “profiling,” and second, in the law’s broad rulemaking mandate:

Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.

For this reason, the CPPA has focused a significant amount of its preliminary rulemaking activities on automated decision-making and profiling. This focus began in the fall of 2021, when profiling and automated decision-making were included among the nine topics on which the Agency sought public comment. In late March, the CPPA hosted informational sessions, during which the Agency discussed automated decision-making for the majority of an entire day, including cross-jurisdictional approaches to automated decision-making and profiling under the EU’s General Data Protection Regulation.

Just last week, the CPPA held stakeholder sessions (Agenda here) over the course of three days, during which it set aside three hours in the first half of the first day for stakeholders to comment on automated decision-making. Importantly, these comments—provided by a range of stakeholders—offer key insights into some of the more complex, challenging issues that businesses will face when adapting their privacy programs to comply with the new rules and restrictions that will be placed on automated decision-making under the CPRA beginning at the start of 2023.

The comments and positions of the individuals who spoke on the topic of automated decision-making varied widely. However, there were several common, key themes reiterated throughout the session that shine a light on concerns shared by various stakeholders, as well as the tug of war between their (and others’) competing interests. The stakeholder comments also highlighted the complexity of striking a balance between regulating automated decision-making technology and profiling in a privacy-protective manner while at the same time avoiding overly restrictive regulations that would hamper innovation. Many of the comments made fell under the following themes:

  • The Type of Automated Decision-Making Activities That Should Be Regulated: Many speakers highlighted the potentially significant, unintended ramifications of an overly broad scope for the term “automated decision-making technology,” which would produce little benefit to consumers while greatly hampering the operations of businesses across all sectors. For that reason, many speakers emphasized the need to limit the reach of automated decision-making regulation to: (1) fully automated decision-making technology; (2) technology that produces legal or similarly significant effects, such as those bearing on a consumer’s employment or credit; and/or (3) high-risk activities, sensitive data, and/or automated decision-making that constitutes profiling. In addition, several other speakers noted the need for a requirement that the term encompass only those activities that involve the processing of personal information (which would seem to be inherent in the CPRA regardless).
  • Consumer Rights Relating to the Use of Automated Decision-Making Technology: Speakers also frequently highlighted the need for balance as it relates to consumers’ access rights regarding automated decision-making technology. On the one hand, as many speakers suggested, the CPRA regulations should not require businesses to disclose information to consumers about low-risk automated decision-making technology, such as spell check or spreadsheets. On the other, the CPPA was cautioned to avoid crafting access rights that would require businesses to provide detailed descriptions of the complex algorithms involved in automated decision-making, as doing so would fail to provide average consumers with “meaningful” information regarding the logic underlying automated processing. At the same time, the required disclosure of algorithms and similar sensitive business information would also likely conflict with the right of businesses to protect their trade secrets and similar types of information.
  • Consumer Opt-Out Rights Relating to Automated Decision-Making: Many speakers shared the common concern that the significant benefits automated decision-making technology offers consumers and businesses alike could be severely hampered by granting consumers overbroad opt-out rights for activities that fall under the definition of automated decision-making. At a minimum, several speakers suggested, regulations relating to automated decision-making should be tethered to the CPRA’s statutory access and opt-out rights.
  • Alignment with the GDPR and other Regulatory Schemes: Many stakeholders, including a representative of the Future of Privacy Forum, urged that the regulations should align with GDPR Article 22. Others pointed to the EU’s pending Digital Services Act, as well as the Artificial Intelligence Act, for other schemes with which the CPRA’s regulations should be consistent.

Conclusion

Following the CPPA’s May stakeholder sessions, the CPPA will begin the formal rulemaking process, but final Regulations are not anticipated to be issued until sometime in early 2023. Companies should monitor developments in CPPA rulemaking to ensure they are aware of any anticipated changes in the law, which will go into effect at the start of 2023. In addition, companies should immediately begin adapting their privacy programs for compliance not only with the CPRA but also with the Colorado, Connecticut, Virginia, and Utah laws that will come online over the course of 2023.

For more information on the stakeholder sessions, including other topics discussed, you can visit the CPPA’s events page here.

Check back often for more of SPB’s and CPW’s thought leadership on the CPRA and the other 2023 state privacy laws, as well as on AI and automated decision-making. For a further discussion of the CPPA’s approach to rulemaking on automated decision-making and profiling, you can view a recording of our recent webinar 2022 Developments and Trends Concerning Biometric Privacy and Artificial Intelligence. In addition, SPB Partners Kyle Fath and Kristin Bryan will take a deeper dive into this and related topics in our June 2 webinar hosted by the International Association of Privacy Professionals (IAPP). Registration for the IAPP webinar is available here (free for IAPP members).

Readers of CPW know that our very own Lydia de la Torre has been selected to be an inaugural board member of the new California Privacy Protection Agency.  Listen to what Lydia and Alan Friel, Deputy Chair of SPB’s Data Privacy group, have to say in a must-listen podcast.  They discuss the history of privacy policy, the growing influence of European privacy principles, and the new privacy laws we are seeing, or can expect, at the state and federal levels here in the United States.  Absolutely essential stuff for anyone working in an industry impacted by this growing body of law.  Listen to it at Tech Freedom here.

And for more on all developments data privacy related, stay tuned.  CPW will keep you in the loop.

We congratulate our friend and colleague Lydia de la Torre on her appointment to the inaugural board for the California Privacy Protection Agency.  “Californians deserve to have their data protected and the individuals appointed today will bring their expertise in technology, privacy and consumer rights to advance that goal,” said Governor Newsom. “These appointees [including Lydia] represent a new day in online consumer protection and business accountability.”

In 2018, California became the first state in the U.S. to equip consumers with new privacy tools and new privacy rights under the California Consumer Privacy Act. On November 3, 2020, California voters approved Proposition 24, the California Privacy Rights Act (CPRA), which created the California Privacy Protection Agency. Enforcement of the CPRA will begin in 2023.  The California Privacy Protection Agency will have full administrative power, authority, and jurisdiction to implement and enforce the California Consumer Privacy Act and the California Privacy Rights Act. The board of the CPPA will appoint the agency’s executive director, officers, counsel and employees. The agency may bring enforcement actions related to the CCPA or CPRA before an administrative law judge. The Attorney General will retain civil enforcement authority over the CCPA and the CPRA.

“The California Privacy Protection Agency marks a historic new chapter in data privacy by establishing the first agency in the country dedicated to protecting forty million Californians’ fundamental privacy rights,” said Attorney General Becerra. “The CPPA Board will help California residents understand and control their data privacy while holding online businesses accountable.”

“The chance to serve on the Board of the new California Privacy Protection Agency is a great opportunity for Lydia, and one for which she is exceptionally well suited given her diverse background and talents.  She has uniquely balanced an academic and private practice career, and public service is a natural next step for her,” said Alan Friel, Deputy Chair of Squire Patton Boggs’ Global Data Privacy & Cybersecurity Practice.  “We could not be happier for her and commend Senator Atkins on the selection of such a qualified individual.  While we are sorry to see Lydia go, her selection continues a long tradition of public service by our attorneys, which our firm fully embraces.”

As readers of CPW already know, in a development that will bring dramatic changes to the California data privacy realm, on November 3, 2020, a majority of Californians voted to approve a new ballot initiative – Proposition 24, or the “California Privacy Rights Act of 2020” (“CPRA”).  You can read the fantastic analysis prepared by CPW’s Lydia de la Torre, Glenn A. Brown, Elliot Golding and Ann J. LaFrance here.

Well folks, one of the main changes brought about by the California Privacy Rights Act is the establishment of the California Privacy Protection Agency (“CPPA”) as an “independent watchdog” whose mission is both to “vigorously enforce” the CPRA and “ensure that businesses and consumers are well‐informed about their rights and obligations.”  Following up on that initial piece, Lydia de la Torre and Glenn A. Brown prepared an incredible, must read analysis as to how, with passage of the CPRA, “the CPPA is set to become a key privacy regulator not only in California, but across the U.S. and the globe”.  Check it out here.

The California Privacy Protection Agency (“CPPA”) will host its next public meeting on Thursday, May 26, 2022 at 11 AM PT. Members of the public may attend in person or virtually by following these instructions. CPPA Executive Director Ashkan Soltani will provide an update on the CPPA’s hiring, budget, and rulemaking activities.  Importantly, subcommittees will provide more information on the course of action for the upcoming rulemaking process, as well as on the anticipated rulemaking draft.

In February, the CPPA announced its strategy of hosting informational preliminary hearings to ensure that the rules it adopts adequately address the most prevalent issues in consumer privacy, and it anticipated that the rulemaking process, including public hearings during the formal comment period, would commence in the third quarter and continue into the fourth quarter of 2022. Earlier this month, the CPPA held a pre-rulemaking stakeholder session during which it heard public comments on automated decision-making, with most comments focusing on: (1) the type of automated decision-making activities that should be regulated; (2) consumer rights relating to the use of automated decision-making technology; (3) consumer opt-out rights relating to automated decision-making; and (4) alignment with the General Data Protection Regulation and other regulatory schemes.

Although final Regulations are not anticipated until sometime in early 2023, the California Privacy Rights Act amendments to the California Consumer Privacy Act (“CCPA”) will go into effect in January 2023. Businesses should therefore monitor CPPA rulemaking activities to ensure they are aware of how the lead CCPA enforcement agency interprets the CCPA’s requirements, and to glean insight into the agency’s potential enforcement priorities.

As part of its continued preliminary rulemaking activities, the California Privacy Protection Agency (“CPPA”) will be holding stakeholder sessions Wednesday, May 4 through Friday, May 6 to provide an opportunity for stakeholders to weigh in on topics relevant to upcoming rulemaking. The Agenda for each of the sessions, which are slated to last an entire day, is available here.

As previewed in a recent post, the California Privacy Protection Agency (“CPPA”) held a public meeting on Thursday, February 17. Notably, the CPPA Board (“the Board”) outlined its authority, explained the authority of the CPPA’s Executive Director, and laid out its nearly year-long plan for rulemaking.  The California Privacy Rights Act (“CPRA”), which substantially amends the existing California Consumer Privacy Act (“CCPA”), becomes effective January 1, 2023.  The CPRA tasked the CPPA, the first data protection authority in the U.S., with filling in many of the CPRA’s details through rulemaking by July 1, 2022.  However, based on the timeline presented, we will be lucky to have regulations by the time the CPRA goes into effect.  In its most optimistic estimate, the CPPA is shooting for the middle of the fourth quarter for final regulations.

The meeting began with an informational presentation, which delineated the distinct roles of the CPPA board members and the Executive Director of the California Privacy Protection Agency. The Board members’ chief responsibility is rulemaking. The Board members also are tasked with hiring the CPPA’s Executive Director (which they have already done), setting policy, adopting regulations, overseeing the Executive Director, and resolving enforcement actions. The Executive Director, in contrast, is responsible for implementing policy, directing and managing the day-to-day operations of the agency, and representing the Board in the media and to the public. The Board members and the Executive Director work together in drafting agendas and strategic plans.

The first phase of rulemaking, which is currently underway, involves the Board members defining the problem, describing objectives, brainstorming solutions, and listing the costs and benefits of potential rules. Next, specific Board committees will draft the proposed regulations, drawing on support from staff and legal counsel. The Board will then approve the language that will become the proposed regulations. The next phase is the staff production phase, whereby staff prepares the rulemaking file for filing with the Office of Administrative Law (“OAL”), which is tasked with making sure the proposed regulations are consistent with the underlying statute and the rulemaking authority it grants the CPPA. Upon notice publication, a 45-day public comment period begins. The Board may hold a hearing during this time, but is required to do so only if a hearing is requested.  Any interested person can submit a written request for a hearing, so long as the request is submitted at least 15 days before the close of the written comment period. After the public comment period concludes, the Board must address all adverse comments it received from the public and may notice changes; if the changes are material, a new 15-day public comment period is required.  This process may be repeated.  Once the CPPA finalizes the proposed text, it submits it, along with a final rule package, to the OAL for review.  More on the CPPA’s rulemaking process is available here.

As to timing, it was announced that preliminary and informational proceedings will take place on unspecified dates in March and April 2022.  These hearings will precede the formal rulemaking process in order to help the Board identify issues to consider. The first set of instructive hearings will occur in mid to late March and will invite comments from academics and experts, who will inform questions and issues relating to topics the Board will explore in rulemaking. The second set of hearings will be for the public. The goal of these second hearings is to receive further information from stakeholders on the topics included in the Board’s rulemaking. Information for stakeholders to sign up and participate will be provided at a later date.

In April, directly following these preliminary hearings and drawing upon the information received from them, the Board will begin the formal rulemaking process (as described above).  The CPPA anticipates that the rulemaking process, including any public hearings during the formal comment period, will commence in the third quarter and continue into the fourth quarter of 2022.  The CPPA acknowledged that this timeline does not comport with its obligation to promulgate regulations by July 1, 2022, as mandated under the CPRA. Regardless, Mr. Soltani, along with the Board members, emphasized the importance of the informational preliminary hearings in ensuring that the rules the Board adopts adequately address the most prevalent issues in consumer privacy.

Beyond rulemaking, the CPPA also reported on its efforts to build out the agency.  Executive Director Soltani characterized the process as “building the car while driving it.” In addition to Mr. Soltani, the CPPA has hired additional support staff, including two Associate Governmental Program Analysts and a law clerk, and it is drawing upon staff resources from the Department of Justice. Mr. Soltani is leading the charge in recruiting for many more positions, including a General Counsel (which was discussed in closed session). The agency’s budget, already a part of the governor’s overall budget, will be presented for Board approval at the March 2, 2022 meeting.

Though slow to get going, the CPPA is picking up momentum, and rulemaking is finally moving forward.  SPB is working with clients, trade organizations, and bar associations to help them participate in the informal and formal rulemaking process.  For more information, contact your SPB relationship partner or the authors.

Privacy regulators in California and Colorado recently made announcements regarding rulemaking for their respective state privacy laws. Last week, the California Privacy Protection Agency (“CPPA”) announced that it will hold its next public meeting this Thursday, February 17, during which it will discuss updates on the rulemaking process, including a timeline. On January 28, Colorado Attorney General Phil Weiser publicly announced the intent of the Colorado Office of the Attorney General (“COAG”) to carry out rulemaking activities to implement the Colorado Privacy Act (“CPA”), providing an indication of focus areas and a rough timeline. We discuss each of these developments in further detail below.

On December 9, 2021, Alan Friel, Co-Chair of the SPB Global Data Privacy, Cybersecurity & Digital Assets Practice, led a fireside chat between U.S. Congresswoman Suzan DelBene and Alastair Mactaggart, as part of the session on Privacy, Security, Data Protection and Trust at the International Institute of Communications’ (“IIC”) Washington DC Telecommunications & Media Forum (“TMF”).  A recording of the discussion is available here.

Congresswoman DelBene serves as the Vice Chair of the powerful House Ways and Means Committee, is the co-chair of the Women’s High-Tech Coalition, and has introduced federal consumer privacy legislation, the Information Transparency & Personal Data Control Act (“H.R. 1816”).  Mr. Mactaggart is the force behind California’s privacy laws and is the Board Chair and Founder of Californians for Consumer Privacy, the organization that sponsored Proposition 24 (the California Privacy Rights Act or the “CPRA”) and the California Consumer Privacy Act of 2018 (“CCPA”).

The panelists discussed recently enacted U.S. state privacy laws and Congresswoman DelBene’s privacy bill, H.R. 1816, which was referred to the Subcommittee on Consumer Protection and Commerce in March 2021.  While the two policymakers agreed on the importance of consumer privacy legislation, their points of view on what that should mean for consumers and businesses diverged, and a spirited debate ensued.  Highlights are as follows:

A National Privacy Standard?

The panelists agreed it would be valuable to have a national privacy standard for safeguarding consumers’ personal data.  Congresswoman DelBene explained that a national privacy standard would:

  • curtail consumer confusion by making it so that consumers’ privacy rights do not change as much as they currently do when consumers travel from state to state;
  • alleviate the burden on businesses, especially small businesses, who may have to use considerable resources to comply with the requirements of each state privacy law; and
  • help to establish the U.S. as a key player in shaping global privacy policy—the Congressperson expressed that it is challenging for the U.S. to weigh in on international privacy issues when we lack a unified national standard.

Mr. Mactaggart agreed, explaining that a national privacy standard would grant privacy protections to people around the country.  However, he noted that H.R. 1816 in its current form would preempt state privacy laws by prohibiting states from adopting, enforcing (or continuing to enforce) laws and regulations related to data privacy, with exceptions.  Mr. Mactaggart recommended that a national privacy standard “should be a floor, not a ceiling,” and should not preempt stricter, non-conflicting state laws, so that states have an opportunity to strengthen privacy protections to meet the needs of their constituents.  He pointed to the Health Insurance Portability and Accountability Act (“HIPAA”) and the Sarbanes-Oxley Act of 2002 as examples of federal laws that have created legal baselines by establishing minimum consumer protection requirements while also allowing states to strengthen protections for their constituents.

Transparency and Enabling Choice Regarding Use

H.R. 1816, as currently drafted, does not include an express right of access (other than with respect to sensitive information), transportable copies, or rights of correction or deletion.  The Congressperson explained that her intent was to propose a bill focused on fundamental policy and consumer rights that sets a solid foundation on which federal legislators can continue building.  Mr. Mactaggart expressed that although he understands the Congressperson’s goal, the effect of H.R. 1816 (which, in its current form, preempts state laws) would be to deprive consumers in states with existing privacy laws (e.g., California) of rights they currently enjoy.  For example, according to Mr. Mactaggart, passing H.R. 1816 as currently drafted would deprive Californians of their rights to see, delete, or correct their information, among other things.  He recommended that a national privacy standard not remove existing privacy protections granted to consumers under state laws.

Scope of Rulemaking

The panelists agreed that an independent agency should be granted rulemaking authority. H.R. 1816, if passed, would grant rulemaking authority to the Federal Trade Commission (FTC) for privacy issues.  In California, the California Privacy Protection Agency (CPPA) has rulemaking authority for privacy.

Enforcement Authorities and Penalties for Non-Compliance

The panelists agreed that the FTC is the most qualified federal agency to lead privacy enforcement.  H.R. 1816, if passed, would be enforceable by both the FTC, a federal agency that has experience and expertise to lead meaningful privacy enforcement, and state attorneys general (but only if the FTC has not acted).  In California, the CPPA has administrative enforcement authority to enforce the CCPA/CPRA.

Private Right of Action

The panelists agreed that granting a private right of action creates challenges for covered businesses.  The Congressperson explained that H.R. 1816 does not have a private right of action because the threat of litigation can be very costly, especially for small businesses.  Mr. Mactaggart agreed and clarified that although there is a private right of action under the CCPA, the right is limited to a specific subset of personal information, and only for instances where a business is negligent in its data security practices.

Public Policy Balance Between Transparency and Choice in Digital Advertising

The panelists agreed that advertising is an important tool in commerce, but that it should be balanced with consumer protection considerations.

  • Mr. Mactaggart advised that privacy laws should contemplate a distinction between contextual advertising and behavioral advertising, which he believes to be a more invasive form of advertising.
  • The Congressperson added that consumers should have the ability to opt-in and opt-out of information sharing depending on the context of their relationship and interaction with a business, and consumers should be provided with tools to help them understand their privacy rights, such as privacy notices that are easy to understand.

Sensitive Personal Information

The panelists agreed that certain types of information are more sensitive, and therefore, should be subjected to a heightened protection standard.

  • The Congressperson explained that companies should be required to obtain affirmative express consent before they can collect and share sensitive personal information (e.g., financial information, health or genetic information, information about children, citizenship or immigration status, gender, religious beliefs, etc.).
  • Mr. Mactaggart added that in California, the new category of “sensitive information” was added to balance giving consumers meaningful privacy rights with the need to enable businesses to utilize data to provide services to consumers.

Where to go from here?

Interestingly, the Congressperson expressed an openness to learning more, noted that her bill was merely a first draft intended to get the legislative process moving, and welcomed input from stakeholders.  Mr. Mactaggart offered to sit down with her staff.  Where this will go next is unclear, but it appears that the discussion will continue.  A recording of the discussion is available here.  The IIC/TMF also covered international privacy issues.  A blog post on that is available here.

2021 has been a monumental year in many ways, and consumer financial privacy litigation and enforcement was no exception.  In the executive branch, the Biden Administration focused on strengthening individual privacy protections and limiting the disclosure of sensitive data.  Meanwhile, the Supreme Court’s decision in TransUnion LLC v. Ramirez continues to have a long-lasting impact in the privacy class action sphere.  Read on to hear about some of the biggest changes in financial privacy in 2021, and what it means for individuals, businesses and litigants in the new year.

TransUnion LLC v. Ramirez Limits Article III Standing in FCRA Class Actions

The Supreme Court dramatically limited the availability of Article III standing for financial privacy litigations in TransUnion LLC v. Ramirez, 141 S. Ct. 2190 (2021).  In Ramirez, a putative class of individuals whose credit reports contained mistaken terrorist designations sued TransUnion under the Fair Credit Reporting Act (“FCRA”).  Of the 8,185 class members, only 1,853 had misleading credit files provided to third-party businesses by TransUnion.  For the remaining 6,332 members, TransUnion maintained erroneous files but did not disseminate them to third parties.  The Supreme Court held that class members whose credit files TransUnion provided to third-party businesses suffered a concrete harm akin to the common law tort of defamation, conferring Article III standing.  According to the Court, however, the remaining class members whose files were not released did not suffer a concrete harm and thus lacked standing.

In considering what constitutes an “injury in fact” under Article III, the Supreme Court held that “[o]nly plaintiffs concretely harmed by a defendant’s statutory violation have Article III standing to seek damages against the private defendant in federal court.”  The Court found that “Article III standing requires a concrete injury even in the context of a statutory violation.”  It is not the case, the Court clarified, that “a plaintiff automatically satisfies the injury-in-fact requirement whenever a statute grants a person a statutory right and purports to authorize that person to sue to vindicate that right.” (emphasis supplied).  The Court in Ramirez also held that in a class action for damages, class members must have Article III standing to recover.  The Court further held that a mere risk of future harm is not a concrete harm in a suit for damages.

What Are The Other Effects of Ramirez?

How else has Ramirez impacted financial privacy litigation?

First, some courts have suggested that Ramirez’s application may be limited earlier in the litigation process.  The court in In re Blackbaud, Inc., Customer Data Breach Litigation, 2021 U.S. Dist. LEXIS 123355 (D.S.C. July 1, 2021), considering a motion to dismiss, noted that Ramirez was distinguishable because it involved a jury verdict.  Christian Labor Association v. City of Duluth, 2021 U.S. Dist. LEXIS 124289 (D. Minn. July 2, 2021), also suggested that Ramirez’s applicability may be limited at the motion to dismiss stage.  However, numerous courts have applied Ramirez on a motion to dismiss.  This ambiguity in the procedural application of Ramirez is one to watch, especially when it comes to class certification.  Indeed, while the Court clarified that all class members seeking damages must establish standing, it expressly left open the question of whether every class member must demonstrate standing before a court certifies a class – an issue that lower courts have been grappling with in the wake of the Ramirez decision.

Second, the Ramirez decision raised concerns that state courts would be flooded with class actions—a “pyrrhic victory,” as Justice Clarence Thomas noted in his dissent.  So far, several courts have remanded putative financial privacy class actions to state courts.  In Lagrisola v. North American Financial Corp., 2021 U.S. Dist. LEXIS 192140 (S.D. Cal. Oct. 5, 2021), a federal court remanded a putative class action alleging violations of California law, and in Winters v. Douglas Emmett, Inc., 2021 U.S. Dist. LEXIS 124495 (C.D. Cal. July 2, 2021), the federal court remanded a putative FCRA class action.  Keep an eye on federal dockets in 2022 to see if these remands signal a growing trend, particularly in the Ninth Circuit.

Furthermore, some courts have attempted to confine Ramirez to defamation-adjacent actions.  For example, the court in Mastel v. Miniclip SA, 2021 U.S. Dist. LEXIS 132401 (E.D. Cal. July 15, 2021), found an injury in fact akin to invasion of privacy rather than defamation, such that Ramirez did not apply.  Similarly, the court in Lupia v. Medicredit, Inc., 8 F.4th 1184 (10th Cir. 2021), permitted an FDCPA claim to proceed, finding an injury in fact similar to intrusion upon seclusion.  In contrast, some courts have denied standing in cases where the defendant failed to disseminate private information, analogizing to defamation.  As a result, we may see a trend of plaintiffs arguing that their underlying harm resembles a tort other than defamation in order to establish Article III standing.

On a related note, while commentators worried that Ramirez would preclude data breach litigations (including cases involving the alleged disclosure of personal financial information) from being brought in federal courts, such concerns have not yet materialized.  The courts in Blackbaud and Cotter v. Checkers Drive-In Restaurants, Inc., 2021 U.S. Dist. LEXIS 160592 (M.D. Fla. Aug. 25, 2021), distinguished Ramirez on procedural grounds.  Meanwhile, some courts have indicated that an impending injury or substantial risk could suffice for injury in fact in data breach litigation.  The court in Griffey v. Magellan Health Inc., 2021 U.S. Dist. LEXIS 184591 (D. Ariz. Sept. 27, 2021), found that plaintiffs alleged risks of future harm that were “certainly impending” and thus had standing.  All in all, however, pleading a data incident without something more probably does not survive a motion to dismiss.  That is what happened in Legg v. Leaders Life Ins. Co., 2021 U.S. Dist. LEXIS 232833 (W.D. Okla. Dec. 6, 2021), where plaintiffs’ allegations of general risks of harm did not suffice.

Eleventh Circuit to Address Article III Standing in Wake of Ramirez After Whiplash in Hunstein v. Preferred Collection and Management Services, Inc.

In April, the Eleventh Circuit held in Hunstein v. Preferred Collection and Management Services, Inc., 994 F.3d 1341 (11th Cir. 2021) (“Hunstein I”), that the transmittal of a debtor’s personal information to a third-party mailing service violated section 1692c(b) of the Fair Debt Collection Practices Act (“FDCPA”).  In Hunstein I, Plaintiff incurred a hospital debt resulting from his son’s medical treatment.  The hospital assigned the debt to a debt collector, which hired a commercial mail vendor, transmitting personal information about Plaintiff along the way.  The Eleventh Circuit held that Plaintiff had suffered a concrete statutory injury sufficient for Article III standing, even though he had not suffered a “tangible harm” or even a “risk of real harm.”

In October, following Ramirez, the Eleventh Circuit vacated its opinion in Hunstein I but, in Hunstein II, doubled down on its original holding.  The Eleventh Circuit held that the plaintiff suffered an intangible but concrete injury, analogizing the disclosure of his personal information to the common law tort of public disclosure of private facts.  Shortly thereafter, in November, the Eleventh Circuit once again vacated Hunstein II and ordered a rehearing en banc, which has yet to occur.

In the meantime, the impact of Hunstein remains unclear.  Hunstein binds only courts within the Eleventh Circuit—but that does not mean other courts are not taking note of how the Eleventh Circuit ultimately rules.

For example, in Keller v. Northstar Locations Services, 2021 U.S. Dist. LEXIS 157820 (N.D. Ill. Aug. 20, 2021), and Thomas v. Unifin, Inc., 2021 U.S. Dist. LEXIS 157814 (N.D. Ill. Aug. 20, 2021), the Northern District of Illinois denied motions to remand individual FDCPA actions, reasoning that disclosing information about a debt to unauthorized third parties resembles invasion of privacy torts.  However, the Eastern District of New York dismissed six mailing vendor class actions in In re FDCPA Vendor Cases, 2021 U.S. Dist. LEXIS 139848 (E.D.N.Y. July 23, 2021), rejecting Hunstein and finding no injury in fact.

Other Financial Privacy Litigation Trends

More broadly, the number of consumer financial privacy cases filed in 2021 continued a year-over-year increase.  For example, according to Lex Machina and LexisNexis statistics, the number of FCRA filings nearly tripled over the last decade and continued to rise in 2021 as compared to 2020.  Litigation under the Telephone Consumer Protection Act (“TCPA”) also remained at a high level.

One trend in FCRA litigation is a rising number of claims brought against employers in the background check context.  As shown by several recent cases, many prospective employers are unaware of the FCRA litigation risk posed by background check disclosure issues because template disclosures and notices are frequently provided by third parties.

Noteworthy Executive and Agency Action in the Financial Privacy Space

The Biden Administration engaged in a number of executive actions in 2021 that impacted the financial privacy sphere.  One notable example was President Biden’s July 9, 2021, Executive Order entitled “Promoting Competition in the American Economy.”  Lurking behind the seemingly economics-focused title are a number of privacy-centric directives.

For instance, the Order instructs the Federal Trade Commission (“FTC”) to use its rulemaking authority to promulgate additional regulations addressing “unfair data collection and surveillance practices that may damage competition, consumer autonomy, and consumer privacy.”  This potentially years-long rulemaking process will focus, in part, on safeguarding the acquisition and transfer of consumer data in mergers and transactions.  Interestingly, the Order simultaneously directs the Consumer Financial Protection Bureau (“CFPB”) to issue rules allowing for data portability of consumers’ banking data to make it easier for consumers to switch financial institutions.

While executive orders set a roadmap for future areas of agency action, agencies like the FTC were already busy enacting and enforcing new privacy policies.  For its part, the FTC issued a new enforcement policy statement warning companies that it is ramping up enforcement in response to a rising number of complaints about the financial harms caused by deceptive sign-up tactics, unauthorized charges, and ongoing charges that are especially burdensome to cancel.  In particular, the enforcement policy condemned “negative option” offers—that is, offers in which a company interprets a consumer’s silence as acceptance, or continuing acceptance, of the offer.  This new FTC enforcement policy might affect, for example, companies that utilize automatic renewals or free-to-pay offer structures.

In contrast, the CFPB slowed the pace of its public enforcement actions in 2021.  Back in 2015, the CFPB was busy, bringing a total of 57 public enforcement actions.  That number declined over the following years—42 actions in 2016, 38 in 2017, and 11 in 2018—before rebounding in 2019 (22 enforcement actions) and 2020 (48 enforcement actions).  In 2021, however, the number of CFPB public enforcement actions fell by more than half, to a mere 18, the second lowest total in over half a decade.  That number may be set for an uptick in 2022 now that Rohit Chopra has been confirmed as CFPB Director and as financial privacy remains a federal priority.

Conclusion

2021 proved to be a year full of consequential developments in the financial privacy space.  Before the first half of 2021 was over, the Supreme Court had issued its monumental Ramirez decision.  That opinion will change the way that litigants, especially class action litigants, approach financial privacy cases involving statutory violations.  Courts, too, continue to grapple with the effects of Ramirez, with some federal courts, like the Eleventh Circuit, reevaluating pending cases, while other federal courts attempt to distinguish Ramirez or limit its application.

Meanwhile, state courts brace for a potential wave of privacy cases in 2022.  The executive branch also demonstrated a keen interest in shaping privacy policy, as the Biden Administration promulgated several key executive orders, while agencies on the ground ramped up enforcement to address potential privacy violations.  While it is hard to know exactly what 2022 holds in store for privacy practitioners, companies, and litigations, the important shifts in privacy law and policy in 2021 are sure to shape the privacy landscape in 2022 and, likely, for years to come.  Not to worry, CPW will be there to keep you in the loop.