Readers of CPW know that our very own Lydia de la Torre has been selected to be an inaugural board member of the new California Privacy Protection Agency.  Listen to what Lydia and Alan Friel, Deputy Chair of SPB’s Data Privacy group, have to say in a must-listen podcast.  They discuss the history of privacy policy, the growing influence of European privacy principles, and the new privacy laws we are seeing, or can expect, at the state and federal levels here in the United States.  Absolutely essential stuff for anyone working in an industry impacted by this growing body of law.  Listen to it at Tech Freedom here.

And for more on all developments data privacy related, stay tuned.  CPW will keep you in the loop.

We congratulate our friend and colleague Lydia de la Torre on her appointment to the inaugural board for the California Privacy Protection Agency.  “Californians deserve to have their data protected and the individuals appointed today will bring their expertise in technology, privacy and consumer rights to advance that goal,” said Governor Newsom. “These appointees [including Lydia] represent a new day in online consumer protection and business accountability.”

In 2018, California became the first state in the U.S. to equip consumers with new privacy tools and new privacy rights under the California Consumer Privacy Act. On November 3, 2020, California voters approved Proposition 24, the California Privacy Rights Act (CPRA), which created the California Privacy Protection Agency. Enforcement of the CPRA will begin in 2023.  The California Privacy Protection Agency will have full administrative power, authority, and jurisdiction to implement and enforce the California Consumer Privacy Act and the California Privacy Rights Act. The board of the CPPA will appoint the agency’s executive director, officers, counsel and employees. The agency may bring enforcement actions related to the CCPA or CPRA before an administrative law judge. The Attorney General will retain civil enforcement authority over the CCPA and the CPRA.

“The California Privacy Protection Agency marks a historic new chapter in data privacy by establishing the first agency in the country dedicated to protecting forty million Californians’ fundamental privacy rights,” said Attorney General Becerra. “The CPPA Board will help California residents understand and control their data privacy while holding online businesses accountable.”

“The chance to serve on the Board of the new California Privacy Protection Agency is a great opportunity for Lydia, and one for which she is exceptionally well suited given her diverse background and talents.  She has uniquely balanced an academic and private practice career, and public service is a natural next step for her,” said Alan Friel, Deputy Chair of Squire Patton Boggs’ Global Data Privacy & Cybersecurity Practice.  “We could not be happier for her and commend Senator Atkins on the selection of such a qualified individual.  While we are sorry to see Lydia go, her selection continues a long tradition of public service by our attorneys, which our firm fully embraces.”

As readers of CPW already know, in a development that will bring dramatic changes to the California data privacy realm, on November 3, 2020, a majority of Californians voted to approve a new ballot initiative – Proposition 24, or the “California Privacy Rights Act of 2020” (“CPRA”).  You can read the fantastic analysis prepared by CPW’s Lydia de la Torre, Glenn A. Brown, Elliot Golding and Ann J. LaFrance here.

Well folks, one of the main changes brought about by the California Privacy Rights Act is the establishment of the California Privacy Protection Agency (“CPPA”) as an “independent watchdog” whose mission is both to “vigorously enforce” the CPRA and “ensure that businesses and consumers are well-informed about their rights and obligations.”  Following up on that initial piece, Lydia de la Torre and Glenn A. Brown prepared an incredible, must-read analysis of how, with passage of the CPRA, “the CPPA is set to become a key privacy regulator not only in California, but across the U.S. and the globe.”  Check it out here.


On December 9, 2021, Alan Friel, Co-Chair of the SPB Global Data Privacy, Cybersecurity & Digital Assets Practice, led a fireside chat between U.S. Congresswoman Suzan DelBene and Alastair Mactaggart, as part of the session on Privacy, Security, Data Protection and Trust at the International Institute of Communications’ (“IIC”) Washington DC Telecommunications & Media Forum (“TMF”).  A recording of the discussion is available here.

Congresswoman DelBene serves as the Vice Chair of the powerful House Ways and Means Committee, is the co-chair of the Women’s High-Tech Coalition, and has introduced federal consumer privacy legislation, the Information Transparency & Personal Data Control Act (“H.R. 1816”).  Mr. Mactaggart is the force behind California’s privacy laws and is the Board Chair and Founder of Californians for Consumer Privacy, the organization that sponsored Proposition 24 (the California Privacy Rights Act or the “CPRA”) and the California Consumer Privacy Act of 2018 (“CCPA”).

The panelists discussed recently enacted U.S. state privacy laws and Congresswoman DelBene’s privacy bill, H.R. 1816, which was referred to the Subcommittee on Consumer Protection and Commerce in March 2021.  While the two policymakers agreed on the importance of consumer privacy legislation, their points of view on what that should mean for consumers and businesses diverged, and a spirited debate ensued.  Highlights are as follows:

A National Privacy Standard?

The panelists agreed it would be valuable to have a national privacy standard for safeguarding consumers’ personal data.  Congresswoman DelBene explained that a national privacy standard would:

  • curtail consumer confusion by making it so that consumers’ privacy rights do not change as much as they currently do when consumers travel from state to state;
  • alleviate the burden on businesses, especially small businesses, who may have to use considerable resources to comply with the requirements of each state privacy law; and
  • help to establish the U.S. as a key player in shaping global privacy policy—the Congressperson expressed that it is challenging for the U.S. to weigh in on international privacy issues when we lack a unified national standard.

Mr. Mactaggart agreed, explaining that a national privacy standard would grant privacy protections to people around the country.  However, he noted that H.R. 1816 in its current form would preempt state privacy laws by prohibiting states from adopting, enforcing (or continuing to enforce) laws and regulations related to data privacy, with exceptions.  Mr. Mactaggart recommended that a national privacy standard “should be a floor, not a ceiling,” and should not preempt stricter, non-conflicting state laws, so that states have an opportunity to strengthen privacy protections to meet the needs of their constituents.  He pointed to the Health Insurance Portability and Accountability Act (“HIPAA”) and the Sarbanes-Oxley Act of 2002 as examples of federal laws that have created legal baselines by establishing minimum consumer protection requirements while also allowing states to strengthen protections for their constituents.

Transparency and Enabling Choice Regarding Use

H.R. 1816, as currently drafted, does not include an express right of access (other than with respect to sensitive information), transportable copies, or rights of correction or deletion.  The Congressperson explained that her intent was to propose a bill focused on fundamental policy and consumer rights that sets a solid foundation on which federal legislators can continue building.  Mr. Mactaggart expressed that although he understands the Congressperson’s goal, the effect of H.R. 1816 (which, in its current form, preempts state laws) would be to deprive consumers in states with existing privacy laws (e.g., California) of rights they currently enjoy.  For example, according to Mr. Mactaggart, passing H.R. 1816 as currently drafted would deprive Californians of their rights to see, delete, or correct their information, among other things.  He recommended that a national privacy standard not remove existing privacy protections granted to consumers under state laws.

Scope of Rulemaking

The panelists agreed that an independent agency should be granted rulemaking authority. H.R. 1816, if passed, would grant rulemaking authority to the Federal Trade Commission (FTC) for privacy issues.  In California, the California Privacy Protection Agency (CPPA) has rulemaking authority for privacy.

Enforcement Authorities and Penalties for Non-Compliance

The panelists agreed that the FTC is the most qualified federal agency to lead privacy enforcement.  H.R. 1816, if passed, would be enforceable by both the FTC, a federal agency with the experience and expertise to lead meaningful privacy enforcement, and state attorneys general (but only if the FTC has not acted).  In California, the CPPA has administrative enforcement authority to enforce the CCPA/CPRA.

Private Right of Action

The panelists agreed that granting a private right of action creates challenges for covered businesses.  The Congressperson explained that H.R. 1816 does not have a private right of action because the threat of litigation can be very costly, especially for small businesses.  Mr. Mactaggart agreed and clarified that although there is a private right of action under the CCPA, the right is limited to a specific subset of personal information, and only for instances where a business is negligent in its data security practices.

Public Policy Balance Between Transparency and Choice in Digital Advertising

The panelists agreed that advertising is an important tool in commerce, but that it should be balanced with consumer protection considerations.

  • Mactaggart advised that privacy laws should distinguish between contextual advertising and behavioral advertising, which he believes to be a more invasive form of advertising.
  • The Congressperson added that consumers should have the ability to opt in to and out of information sharing depending on the context of their relationship and interaction with a business, and consumers should be provided with tools to help them understand their privacy rights, such as privacy notices that are easy to understand.

Sensitive Personal Information

The panelists agreed that certain types of information are more sensitive, and therefore, should be subjected to a heightened protection standard.

  • The Congressperson explained that companies should be required to obtain affirmative express consent before they can collect and share sensitive personal information (e.g., financial information, health or genetic information, information about children, citizenship or immigration status, gender, religious beliefs, etc.).
  • Mactaggart added that in California, the new category of “sensitive information” was added to balance giving consumers meaningful privacy rights with the need to enable businesses to utilize data to provide services to consumers.

Where to go from here?

Interestingly, the Congressperson expressed an openness to learning more, noted that her bill was merely a first draft intended to get the legislative process moving, and welcomed input from stakeholders.  Mr. Mactaggart offered to sit down with her staff.  Where this will go next is unclear, but it appears that the discussion will continue.  A recording of the discussion is available here.  The IIC/TMF also covered international privacy issues.  A blog post on that is available here.

2021 has been a monumental year in many ways, and consumer financial privacy litigation and enforcement was no exception.  In the executive branch, the Biden Administration focused on strengthening individual privacy protections and limiting the disclosure of sensitive data.  Meanwhile, the Supreme Court’s decision in TransUnion LLC v. Ramirez continues to have a long-lasting impact in the privacy class action sphere.  Read on to hear about some of the biggest changes in financial privacy in 2021, and what it means for individuals, businesses and litigants in the new year.

TransUnion LLC v. Ramirez Limits Article III Standing in FCRA Class Actions

The Supreme Court dramatically limited the availability of Article III standing for financial privacy litigations in TransUnion LLC v. Ramirez, 141 S. Ct. 2190 (2021).  In Ramirez, a putative class of individuals whose credit reports contained mistaken terrorist designations sued TransUnion under the Fair Credit Reporting Act (“FCRA”).  Out of 8,185 class members, only 1,853 had misleading credit files provided to third-party businesses by TransUnion.  For the remaining 6,332 members, TransUnion maintained erroneous files but did not disseminate them to third parties.  The Supreme Court held that class members whose credit files TransUnion provided to third-party businesses suffered a concrete harm akin to the common law tort of defamation, conferring Article III standing.  According to the Court, however, the remaining class members whose files were not released did not suffer a concrete harm and thus lacked standing.

In considering what constitutes an “injury in fact” under Article III, the Supreme Court held that “[o]nly plaintiffs concretely harmed by a defendant’s statutory violation have Article III standing to seek damages against the private defendant in federal court.”  The Court found that “Article III standing requires a concrete injury even in the context of a statutory violation.”  It is not the case, the Court clarified, that “a plaintiff automatically satisfies the injury-in-fact requirement whenever a statute grants a person a statutory right and purports to authorize that person to sue to vindicate that right.” (emphasis supplied).  The Court in Ramirez also held that in a class action for damages, class members must have Article III standing to recover.  The Court further held that a mere risk of future harm is not a concrete harm in a suit for damages.

What Are The Other Effects of Ramirez?

How else has Ramirez impacted financial privacy litigation?

First, some courts suggest that Ramirez’s application is limited at earlier stages of litigation.  The court in In re Blackbaud, Inc., Customer Data Breach Litigation, 2021 U.S. Dist. LEXIS 123355 (D.S.C. July 1, 2021), considering a motion to dismiss, noted that Ramirez was distinguishable because it was decided after a jury verdict.  Christian Labor Association v. City of Duluth, 2021 U.S. Dist. LEXIS 124289 (D. Minn. July 2, 2021), also suggested Ramirez’s applicability may be limited at the motion to dismiss stage.  However, numerous courts have applied Ramirez on a motion to dismiss.  This ambiguity in the procedural application of Ramirez is one to watch, especially when it comes to class certification.  Indeed, while the Court clarified that all class members seeking damages must establish standing, it expressly left open the question of whether every class member must demonstrate standing before a court certifies a class – an issue that lower courts have been grappling with in the wake of the Ramirez decision.

Second, the Ramirez decision raised concerns that state courts would be flooded with class actions—a “pyrrhic victory,” as Justice Clarence Thomas noted in his dissent.  So far, several courts have remanded putative financial privacy class actions to state courts.  In Lagrisola v. North American Financial Corp., 2021 U.S. Dist. LEXIS 192140 (S.D. Cal. Oct. 5, 2021), a federal court remanded a putative class action alleging violations of California law, and in Winters v. Douglas Emmett, Inc., 2021 U.S. Dist. LEXIS 124495 (C.D. Cal. July 2, 2021), the federal court remanded a putative FCRA class action.  Keep an eye on federal dockets in 2022 to see if these remands signal a growing trend, particularly in the Ninth Circuit.

Furthermore, some courts have attempted to contain Ramirez to defamation-adjacent actions.  For example, the court in Mastel v. Miniclip SA, 2021 U.S. Dist. LEXIS 132401 (E.D. Cal. July 15, 2021), found an injury in fact akin to invasion of privacy rather than defamation, such that Ramirez did not control.  Similarly, the court in Lupia v. Medicredit, Inc., 8 F.4th 1184 (10th Cir. 2021), permitted an FDCPA claim to proceed, finding an injury in fact similar to intrusion upon seclusion.  In contrast, some courts have denied standing in cases where the defendant failed to disseminate private information, analogizing to defamation.  As a result, we may see a trend of plaintiffs arguing that their underlying harm resembles a tort other than defamation to uphold Article III standing.

On a related note, while commentators worried that Ramirez would preclude data breach litigations (including cases involving the alleged disclosure of personal financial information) from being brought in federal courts, such concerns have not yet materialized.  The courts in Blackbaud and Cotter v. Checkers Drive-In Restaurants, Inc., 2021 U.S. Dist. LEXIS 160592 (M.D. Fla. Aug. 25, 2021), distinguished Ramirez on procedural grounds.  Meanwhile, some courts have indicated that an impending injury or substantial risk could suffice for injury in fact in data breach litigation.  The court in Griffey v. Magellan Health Inc., 2021 U.S. Dist. LEXIS 184591 (D. Ariz. Sept. 27, 2021), found that plaintiffs alleged risks of future harm that were “certainly impending” and thus had standing.  All in all, however, pleading a data incident without something more probably does not survive a motion to dismiss.  That’s what happened in Legg v. Leaders Life Ins. Co., 2021 U.S. Dist. LEXIS 232833 (W.D. Okla. Dec. 6, 2021), where plaintiffs’ allegations of general risks of harm did not suffice.

Eleventh Circuit to Address Article III Standing in Wake of Ramirez After Whiplash in Hunstein v. Preferred Collection and Management Services, Inc.

In April, the Eleventh Circuit held in Hunstein v. Preferred Collection and Management Services, Inc., 994 F.3d 1341 (11th Cir. 2021), that the transmittal of a debtor’s personal information to a third-party mailing service violated section 1692c(b) of the Fair Debt Collection Practices Act (“FDCPA”).  In Hunstein I, Plaintiff incurred a hospital debt resulting from his son’s medical treatment.  The hospital assigned the debt to a debt collector, who hired a commercial mail vendor, transmitting personal information about Plaintiff along the way.  The Eleventh Circuit held that Plaintiff had suffered a concrete statutory injury sufficient for Article III standing, even though he had not suffered a “tangible harm” or even a “risk of real harm.”

In October, following Ramirez, the Eleventh Circuit vacated its opinion in Hunstein I but doubled down on its original holdings.  The Eleventh Circuit held that the plaintiff suffered an intangible but concrete injury, analogizing the disclosure of his personal information to the common law tort of public disclosure of private facts.  Shortly thereafter, in November, the Eleventh Circuit once again vacated Hunstein II and ordered a rehearing en banc, which has yet to occur.

In the meantime, the impact of Hunstein remains unclear.  Hunstein only binds courts within the Eleventh Circuit—but that doesn’t mean that other courts don’t take note of how the Eleventh Circuit subsequently rules.

For example, in Keller v. Northstar Locations Services, 2021 U.S. Dist. LEXIS 157820 (N.D. Ill. Aug. 20, 2021), and Thomas v. Unifin, Inc., 2021 U.S. Dist. LEXIS 157814 (N.D. Ill. Aug. 20, 2021), the Northern District of Illinois denied motions to remand individual FDCPA actions, reasoning that disclosing information about a debt to unauthorized third parties resembles invasion of privacy torts.  However, the Eastern District of New York dismissed six mailing vendor class actions in In re FDCPA Vendor Cases, 2021 U.S. Dist. LEXIS 139848 (E.D.N.Y. July 23, 2021), rejecting Hunstein and finding no injury in fact.

Other Financial Privacy Litigation Trends

More broadly, the number of consumer financial privacy cases filed in 2021 continued a year-over-year increase.  For example, according to Lex Machina and LexisNexis statistics, the number of FCRA litigations nearly tripled over the last decade, with the number of filings continuing to rise compared to 2020.  Litigation under the Telephone Consumer Protection Act (“TCPA”) also remained at a high level (for more on this, be sure to check out CPW’s prior coverage).

One trend in FCRA litigation is a rising number of claims brought against employers in the background check context.  As shown by some recent cases, many prospective employers are not aware of potential FCRA litigation risk concerning background check disclosure issues because template disclosures and notices are frequently provided by third parties.

Noteworthy Executive and Agency Action in the Financial Privacy Space

The Biden Administration engaged in a number of executive actions in 2021 that impacted the financial privacy sphere.  One of these notable executive actions was President Biden’s July 9, 2021, Executive Order entitled “Promoting Competition in the American Economy.” Lurking behind the seemingly economic-based title are a number of privacy-centric regulations.

For instance, the Order instructs the Federal Trade Commission (“FTC”) to use its rulemaking authority to promulgate additional regulations addressing “unfair data collection and surveillance practices that may damage competition, consumer autonomy, and consumer privacy.”  This potentially years-long rulemaking process will focus, in part, on safeguarding the acquisition and transfer of consumer data in mergers and transactions.  Interestingly, the Order simultaneously directs the Consumer Financial Protection Bureau (“CFPB”) to issue rules allowing for data portability of consumers’ banking data to make it easier for consumers to switch financial institutions.

While executive orders set a roadmap for future areas of agency action, agencies like the FTC were already busy enacting and enforcing new privacy policies.  For its part, the FTC issued a new enforcement policy statement warning companies that it is ramping up enforcement in response to a rising number of complaints about the financial harms caused by deceptive sign-up tactics, unauthorized charges, and ongoing charges that are especially burdensome to cancel.  In particular, the enforcement policy condemned “negative option” offers, in which a company interprets a consumer’s silence as acceptance or continuing acceptance of an offer.  This new FTC enforcement policy might affect, for example, companies that utilize automatic renewals or free-to-pay offer structures.

In contrast, the CFPB slowed the pace of its public enforcement actions in 2021.  Back in 2015, the CFPB was busy, bringing a total of 57 public enforcement actions.  That number declined over the next few years, with 42 actions in 2016, 38 actions in 2017, and only 11 actions in 2018, before rebounding in 2019 (22 enforcement actions) and 2020 (48 enforcement actions).  In 2021, however, the number of CFPB public enforcement actions more than halved to a mere 18, the second lowest total in over half a decade.  This number may be set for an uptick in 2022 now that Rohit Chopra has been confirmed as CFPB Director and as financial privacy remains a federal priority.


2021 proved to be a year full of consequential developments in the financial privacy space.  Before the first half of 2021 was over, the Supreme Court had issued its monumental Ramirez decision.  That opinion will change the way that litigants, especially class action litigants, approach financial privacy cases involving statutory violations.  Courts, too, continue to grapple with the effects of Ramirez, with some federal courts, like the Eleventh Circuit, reevaluating pending cases, while other federal courts attempt to distinguish Ramirez or limit its application.

Meanwhile, state courts brace for a potential wave of privacy cases in 2022.  The executive branch also demonstrated a keen interest in shaping privacy policy, as the Biden Administration promulgated several key executive orders, while agencies on the ground ramped up enforcement to address potential privacy violations.  While it is hard to know exactly what 2022 holds in store for privacy practitioners, companies, and litigations, the important shifts in privacy law and policy in 2021 are sure to shape the privacy landscape in 2022 and, likely, for years to come.  Not to worry, CPW will be there to keep you in the loop.


Registration is open for a series of upcoming not-to-be-missed webinars covering key areas for companies seeking to navigate the global compliance landscape.  Register below for insights from CPW’s Alan Friel, Marisol Mork, Eric Troutman and others.

Webinar Series: Advertising, Media and Brands – Global Compliance Challenges

2021 has provided unique challenges for businesses operating across the advertising, media and brands industry. Aside from the impact of the pandemic, we are seeing a changing and challenging landscape due to increasing economic, consumer, regulatory and compliance pressures.

With increased exposure as a result of these pressures, Squire Patton Boggs and BDO will be hosting four webinars to support the advertising, media and brands industry in navigating these challenges:

  • November 11, 2021 – Global Data, Technology and Tax
  • November 30, 2021 – M&A Landscape, Post-COVID-19 Transaction Trends and Tips, and Top Five Due Diligence Risks
  • January 12, 2022 – Global Anti-counterfeiting and Brand Protection Trends, and Top Five AMB Hot Topics
  • February 2, 2022 – The Rise of ESG and Global Workplace Challenges

Hosted by Squire Patton Boggs and BDO

Click here to register.

Conference: ANA/BAA Marketing Law Conference (In-Person and Virtual)

Nov. 15-17, 2021: San Diego

Session: California Privacy: What Direction Next From CCPA and CPRA?

Alan Friel (Squire Patton Boggs) will review California’s privacy laws with representatives from the California Privacy Protection Agency and the OAG.

Session: State and Local Attorney General Enforcement updates by Marisol Mork (Squire Patton Boggs)

Session: TCPA updates by Eric Troutman (Squire Patton Boggs)

Hosted by ANA.

Click here to register.

This week, thousands of (fully vaccinated) privacy lawyers and professionals will descend upon San Diego to attend the International Association of Privacy Professionals (IAPP) Privacy.Security.Risk 2021 Conference. If you are attending P.S.R., please consider two breakout sessions moderated by Alan Friel and Kyle Fath from Squire Patton Boggs’ Global Data Privacy, Cybersecurity and Digital Assets Practice.

Alan, recently named Co-Chair of SPB’s Global Data Practice, will be moderating a panel on “Data For Good: Empowering Innovation Through Ethical Uses of Data” with Jules Polonetsky, CEO, Future of Privacy Forum, Lydia de la Torre, former Squire Patton Boggs attorney and Board Member of the California Privacy Protection Agency, and Barbara Lawler, COO and Sr. Strategist, The Information Accountability Foundation on Thursday, Oct. 21 at 4 PM PDT.

Kyle Fath, Of Counsel, will be moderating a panel entitled “The Cookieless Future: What It Means & How to Proactively Address Privacy Issues” on Thursday, Oct. 21 at 2:30 PM PDT. Joining Kyle on the panel are Beatrice Botti, Global Data & Privacy Officer, DoubleVerify, Julia Shullman, General Counsel and Chief Privacy Officer of TripleLift, and Brendan Smith, Founder and CEO of Enigma Data.

The Federal Trade Commission (FTC) has made it clear: data privacy and cybersecurity are now a priority, and will be for years to come. In the wake of PrivacyCon 2021, the FTC’s sixth annual privacy, cybersecurity and consumer protection summit, held this summer, the FTC finally took official and sweeping action on privacy and cybersecurity. In particular, the Commission recently designated eight key areas of focus for enforcement and regulatory action, three of which directly implicate privacy, cybersecurity, and consumer protection. Below, we discuss the FTC’s action and what it means for businesses, the three key areas of interest to consumer privacy that are now in the FTC’s spotlight, as well as their relation to state privacy legislation and their anticipated impact on civil litigation. Full details on PrivacyCon 2021 and the FTC’s resolutions following the summit can be found on the FTC’s website, linked here for your convenience.

The FTC’s Actions and Areas of Focus

In mid-September, the FTC voted to approve a series of resolutions directed at key enforcement areas, including the following, each discussed in further detail below:

  • Children Under 18: Harmful conduct directed at children under 18 has been a source of significant public concern; the resolution allows FTC staff to expeditiously investigate allegations in this important area.
  • Algorithmic and Biometric Bias: Allows staff to investigate allegations of bias in algorithms and biometrics. Algorithmic bias was the subject of a recent FTC blog.
  • Deceptive and Manipulative Conduct on the Internet: This includes, but is not limited to, the “manipulation of user interfaces,” such as dark patterns, which were also the subject of a recent FTC workshop.

The approval of this series of resolutions will enable the Commission “to efficiently and expeditiously investigate conduct in core FTC priority areas.”  Through the passage of the resolutions, the FTC has now directed that all “compulsory processes” available to it be used in connection with COPPA enforcement. This omnibus resolution mobilizes the full force of the FTC for the next ten years and gives FTC staff full authority to conduct investigations and commence enforcement actions in pursuit of this goal. The FTC has offered very little elaboration, however, regarding how it will use such “compulsory processes,” which include subpoenas, civil investigative demands, and other demands for documents or testimony.

What does seem clear, however, is that the FTC is buckling down on the enforceability of its own actions. Previous remarks by Chair Lina M. Khan before the House Energy and Commerce Committee expressed frustration at the frequent hamstringing of the agency at the hands of courts in its past enforcement efforts. With this declaration of renewed energy, the FTC is summoning all the power it can to do its job, and we should expect to see an energized FTC kick up its patrol efforts in the near future. Businesses that conduct activities implicating these renewed areas should be aware of the FTC’s focus and penchant for investigations and enforcement in such areas.

Children Under 18

The FTC’s mandate to focus on harmful conduct directed at children under 18 is a signal that the Commission plans on broadening and doubling down on its already active enforcement efforts in this area. Areas of the Commission’s prior and current focus on children include marketing claims, loot boxes and other virtual items that can be purchased in games, and in-app and recurring purchases made by children without parental authorization. Most importantly, the FTC is the main arbiter of children’s online privacy through its enforcement of the Children’s Online Privacy Protection Act (“COPPA”), but that law only applies to children under 13 (i.e., 12 and under).  With this new mandate to focus on children under 18, we can certainly expect the FTC to focus on consumer privacy issues, broader than COPPA, for children from ages 13 to 17 as well.

Algorithmic and Biometric Bias

The FTC already has enforcement capabilities to regulate the development and use of artificial intelligence (“AI”) and its associated algorithms. These include Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices,” the Fair Credit Reporting Act, which rears its head when algorithms impact lenders’ decisions to provide credit, and the Equal Credit Opportunity Act, which prohibits the use of biased algorithms that discriminate on the basis of race, color, sex, age, and so on when making credit determinations. In using these tools, the FTC aims to clarify how algorithms are used and how the data that feeds them contributes to algorithmic output, and to bring to light issues that arise when algorithms don’t work or feed on improper biases.

Bias and discrimination arising from use of biometrics will also now be a focus of the FTC. Interestingly, much recent research and criticism has pointed out that algorithms and biometric systems are biased against faces of color. This has arisen in many contexts, from the iPhone’s FaceID feature to the 2020 remotely-administered bar exam that threatened to fail applicants of color because their webcams could not detect their faces. These are just some of the issues that arise when companies turn to algorithms to try to create heuristics in making business decisions. The FTC has not let these concerns go by the wayside, and after preliminarily addressing them in an April 2021 blog post, has now reestablished that algorithmic and biometric bias is a new focus for the upcoming years.

Notably, AI and other automated decision-making, particularly that which results in legal and/or discriminatory effects, will also become regulated under omnibus privacy legislation in California, Virginia, and Colorado, forthcoming in 2023.

Deceptive and Manipulative Conduct on the Internet (Including “Dark Patterns”)

The sinisterly-nicknamed practice of “dark patterns” happens constantly to online consumers, albeit in ways that tend to seem benign. For example, shoppers contemplating items in their cart may be pressured to complete the sale if they receive a notification like, “Hurry, three other people have this in their cart!” More annoyingly, online consumers who wish to unsubscribe from newsletters or email blasts may find themselves clicking through multiple pages just to free their inboxes, rather than being offered an easily identifiable, quickly accessible “unsubscribe” button. “Dark patterns” is the term coined for these sorts of techniques, which impair consumers’ autonomy and create traps for online shoppers.

Earlier this year, the FTC hosted a workshop called “Bringing Dark Patterns to Light,” and sought comments from experts and the public to evaluate how dark patterns impact customers. The FTC was particularly concerned with the harms dark patterns cause, and how they may take advantage of certain groups of vulnerable consumers. The FTC is not alone in its attention to this issue; in March, California’s Attorney General announced regulations that banned dark patterns and required disclosure to consumers of the right to opt out of the sale of personal information collected through online cookies. These regulations also prohibit companies from requiring consumers who wish to opt out to click through myriad screens before achieving their goal. On the opposite coast, the weight-loss app Noom now faces a class action alleging deceptive acts through Noom’s cancellation policy, automatic renewal schemes, and marketing to consumers.

With both public and private entities turning their eyes toward dark patterns, the FTC has now declared the agency will put its full weight behind seeking out and investigating “unfair, deceptive, anticompetitive, collusive, coercive, predatory, exploitative, or exclusionary acts or practices…including, but not limited to, dark patterns…” Keeping an eye on this work will be important—just as important as keeping an eye on which cookies you accept, and which are best to just let go stale.

In addition to being in the crosshairs of the FTC, dark patterns are also a focus of regulators across the globe, including in Europe, and will be regulated under the forthcoming California Privacy Rights Act.

Anticipated Litigation Trends

With the FTC declaring its intent to vigorously investigate these three areas, we now turn to what the agency’s new enforcement priorities mean for civil litigation. As practitioners in this field already know, these priorities are unlikely to result in an influx of new litigation. The FTC’s enforcement authority exists pursuant to Section 5(a) of the FTC Act, which outlaws “unfair or deceptive acts or practices in or affecting commerce,” but that statute does not contain a private right of action, so plaintiffs cannot bring new suits based directly on the agency’s enforcement priorities.

However, these areas of focus could influence broader trends in civil litigation, even if, on their own, they do not create any new liability. Successful enforcement actions by the FTC could bring about new industry standards with respect to algorithmic bias, dark patterns, and other areas of focus. These standards, in turn, could be cited in consumer privacy class action complaints. New civil actions could also stem from enforcement actions by the FTC and the information revealed in settlements resulting from such actions. For example, the FTC announced a settlement with fertility-tracking app Flo Health Inc. in January; this month, a consolidated class action complaint was filed against Flo Health, stemming from seven proposed class actions filed against it this year, alleging that the app unlawfully shared users’ health information with third parties.

Although the FTC’s new enforcement priorities seem ambitious, recent developments may impede its ability to bring enforcement actions in these areas. The agency was dealt a blow in April of this year, when the Supreme Court ruled in AMG Capital Mgmt., LLC v. FTC that the agency lacks power to seek monetary recovery under Section 13(b) of the FTC Act. Legislation to restore this power to the agency passed the House, but is awaiting a Senate vote. More recently, the Senate voted to advance the nomination of Rohit Chopra, currently a Democratic Commissioner, to lead the Consumer Financial Protection Bureau. The White House announced that President Biden will nominate Alvaro Bedoya, a privacy scholar with expertise in surveillance and data security, to fill Commissioner Chopra’s seat. As Commissioner, Bedoya’s likely priorities include the FTC’s enforcement of various privacy laws, including the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act, which could further impact litigation brought under those statutes.



As seasoned data privacy and biometric litigators are already aware, the United States, unlike the European Union and many other countries, does not have a comprehensive federal law regulating the collection, processing, disclosure, and security of personal information (“PI”)—typically defined as information that identifies, or is reasonably capable of being linked to, an individual. Rather, a patchwork of sectoral federal and state laws governs these areas.