Registration is open for a series of upcoming not-to-be-missed webinars covering key areas for companies seeking to navigate the global compliance landscape. Register below for insights from CPW’s Alan Friel, Marisol Mork, and others.

Webinar Series: Advertising, Media and Brands – Global Compliance Challenges

2021 has provided unique challenges for businesses operating across the advertising, media and brands industry. Aside from the impact of the pandemic, we are seeing a changing and challenging landscape due to increasing economic, consumer, regulatory and compliance pressures.

With increased exposure as a result of these pressures, Squire Patton Boggs and BDO will be hosting four webinars to support the advertising, media and brands industry in navigating these challenges:

  • November 11, 2021 – Global Data, Technology and Tax
  • November 30, 2021 – M&A Landscape, Post-COVID-19 Transaction Trends and Tips, and Top Five Due Diligence Risks
  • January 12, 2022 – Global Anti-counterfeiting and Brand Protection Trends, and Top Five AMB Hot Topics
  • February 2, 2022 – The Rise of ESG and Global Workplace Challenges

Hosted by Squire Patton Boggs and BDO

Click here to register.

Conference: ANA/BAA Marketing Law Conference (In-Person and Virtual)

Nov. 15-17, 2021: San Diego

Session: California Privacy: What Direction Next From CCPA and CPRA?

Alan Friel (Squire Patton Boggs) will review California’s privacy laws with representatives from the California Privacy Protection Agency and the OAG.

Session: State and Local Attorney General Enforcement Updates by Marisol Mork (Squire Patton Boggs)

Hosted by ANA.

Click here to register.

This week, thousands of (fully vaccinated) privacy lawyers and professionals will descend upon San Diego to attend the International Association of Privacy Professionals (IAPP) Privacy.Security.Risk 2021 Conference. If you are attending P.S.R., please consider two breakout sessions moderated by Alan Friel and Kyle Fath from Squire Patton Boggs’ Global Data Privacy, Cybersecurity and Digital Assets Practice.

Alan, recently named Co-Chair of SPB’s Global Data Practice, will be moderating a panel on “Data For Good: Empowering Innovation Through Ethical Uses of Data” with Jules Polonetsky, CEO, Future of Privacy Forum; Lydia de la Torre, former Squire Patton Boggs attorney and Board Member of the California Privacy Protection Agency; and Barbara Lawler, COO and Sr. Strategist, The Information Accountability Foundation, on Thursday, Oct. 21 at 4 PM PDT.

Kyle Fath, Of Counsel, will be moderating a panel entitled “The Cookieless Future: What It Means & How to Proactively Address Privacy Issues” on Thursday, Oct. 21 at 2:30 PM PDT. Joining Kyle on the panel are Beatrice Botti, Global Data & Privacy Officer, DoubleVerify, Julia Shullman, General Counsel and Chief Privacy Officer of TripleLift, and Brendan Smith, Founder and CEO of Enigma Data.

The Federal Trade Commission (FTC) has made it clear: data privacy and cybersecurity are now a priority, and will be for years to come. In the wake of PrivacyCon 2021, the FTC’s sixth annual privacy, cybersecurity and consumer protection summit held this summer, the FTC finally took official and sweeping action on privacy and cybersecurity. In particular, the Commission recently designated eight key areas of focus for enforcement and regulatory action, three of which directly implicate privacy, cybersecurity, and consumer protection. Below, we discuss the FTC’s action and what it means for businesses, the three key areas of interest to consumer privacy that are now in the FTC’s spotlight, their relation to state privacy legislation, and their anticipated impact on civil litigation. Full details on PrivacyCon 2021 and the FTC’s resolutions following the summit can be found on the FTC’s website, linked here for your convenience.

The FTC’s Actions and Areas of Focus

In mid-September, the FTC voted to approve a series of resolutions, directed at key enforcement areas, including the following, each discussed in further detail below:

  • Children Under 18: Harmful conduct directed at children under 18 has been a source of significant public concern; FTC staff will now be able to expeditiously investigate allegations in this important area.
  • Algorithmic and Biometric Bias: The resolution allows staff to investigate allegations of bias in algorithms and biometrics. Algorithmic bias was the subject of a recent FTC blog post.
  • Deceptive and Manipulative Conduct on the Internet: This includes, but is not limited to, the “manipulation of user interfaces,” such as dark patterns, which were also the subject of a recent FTC workshop.

The approval of this series of resolutions will enable the Commission “to efficiently and expeditiously investigate conduct in core FTC priority areas.” Through the passage of the resolutions, the FTC has now directed that all “compulsory processes” available to it be used in connection with COPPA enforcement. This omnibus resolution mobilizes the full force of the FTC for the next ten years and gives FTC staff full authority to conduct investigations and commence enforcement actions in pursuit of this goal. The FTC has offered very little elaboration, however, on how it will use such “compulsory processes,” which include subpoenas, civil investigative demands, and other demands for documents or testimony.

What does seem clear, however, is that the FTC is buckling down on the enforceability of its own actions. In previous remarks before the House Energy and Commerce Committee, Chair Lina M. Khan expressed frustration at how courts have hamstrung the agency’s past enforcement efforts. With this declaration of renewed energy, the FTC is summoning all the power it can to do its job, and we should expect to see an energized FTC step up its patrol efforts in the near future. Businesses whose activities implicate these renewed areas should be aware of the FTC’s focus and penchant for investigations and enforcement in such areas.

Children Under 18

The FTC’s mandate to focus on harmful conduct directed at children under 18 signals that the Commission plans to broaden and double down on its already active enforcement efforts in this area. Areas of the Commission’s prior and current focus on children include marketing claims, loot boxes and other virtual items that can be purchased in games, and in-app and recurring purchases made by children without parental authorization. Most importantly, the FTC is the main arbiter of children’s online privacy through its enforcement of the Children’s Online Privacy Protection Act (“COPPA”), but that law only applies to children under 13 (i.e., 12 and under). With this new directive to focus on children under 18, we can certainly expect the FTC to address consumer privacy issues beyond COPPA for children ages 13 to 17 as well.

Algorithmic and Biometric Bias

The FTC already has enforcement tools to regulate the development and use of artificial intelligence (“AI”) and its associated algorithms. These include Section 5 of the FTC Act, which prohibits “unfair or deceptive acts or practices”; the Fair Credit Reporting Act, which comes into play when algorithms affect lenders’ decisions to extend credit; and the Equal Credit Opportunity Act, which prohibits the use of biased algorithms that discriminate on the basis of race, color, sex, age, and other protected characteristics when making credit determinations. In using these tools, the FTC aims to clarify how algorithms are used and how the data that feeds them contributes to algorithmic output, and to bring to light issues that arise when algorithms do not work or reflect improper biases.

Bias and discrimination arising from the use of biometrics will also now be a focus of the FTC. Interestingly, much recent research and criticism has pointed out that algorithms and biometric systems are biased against faces of color. This has arisen in many contexts, from the iPhone’s FaceID feature to the 2020 remotely administered bar exam that threatened to fail applicants of color because their webcams could not detect their faces. These are just some of the issues that arise when companies turn to algorithms to create heuristics for business decisions. The FTC has not let these concerns go by the wayside, and after preliminarily addressing them in an April 2021 blog post, has now made clear that algorithmic and biometric bias will be a focus in the coming years.

Notably, AI and other automated decision-making, particularly that which results in legal and/or discriminatory effects, will also become regulated under omnibus privacy legislation in California, Virginia, and Colorado, forthcoming in 2023.

Deceptive and Manipulative Conduct on the Internet (Including “Dark Patterns”)

The sinisterly nicknamed practice of “dark patterns” happens constantly to online consumers, albeit in ways that tend to seem benign. For example, shoppers contemplating items in their cart may be pressured to complete the sale if they receive a notification like, “Hurry, three other people have this in their cart!” More annoyingly, online consumers who wish to unsubscribe from newsletters or email blasts may find themselves having to click through multiple pages just to free their inboxes, rather than finding an easily identifiable and quickly accessible “unsubscribe” button. “Dark patterns” is the term coined for these sorts of techniques, which impair consumers’ autonomy and create traps for online shoppers.

Earlier this year, the FTC hosted a workshop called “Bringing Dark Patterns to Light” and sought comments from experts and the public to evaluate how dark patterns impact consumers. The FTC was particularly concerned with the harms caused by dark patterns and how they may take advantage of certain groups of vulnerable consumers. The FTC is not alone in its attention to this issue; in March, California’s Attorney General announced regulations that ban dark patterns and require disclosure to consumers of the right to opt out of the sale of personal information collected through online cookies. These regulations also prohibit companies from requiring consumers who wish to opt out to click through myriad screens before achieving their goal. On the opposite coast, the weight-loss app Noom now faces a class action alleging deceptive acts through Noom’s cancellation policy, automatic renewal schemes, and marketing to consumers.

With both public and private entities turning their eyes toward dark patterns, the FTC has now declared the agency will put its full weight behind seeking out and investigating “unfair, deceptive, anticompetitive, collusive, coercive, predatory, exploitative, or exclusionary acts or practices…including, but not limited to, dark patterns…” Keeping an eye on this work will be important—just as important as keeping an eye on which cookies you accept, and which are best to just let go stale.

In addition to being in the crosshairs of the FTC, dark patterns are also a focus of regulators across the globe, including in Europe, and will be regulated under the forthcoming California Privacy Rights Act.

Anticipated Litigation Trends

With the FTC declaring its intent to vigorously investigate the three aforementioned areas, we now turn to what the agency’s new enforcement priorities mean for civil litigation. As practitioners in this field already know, it is unlikely that these priorities will result in an influx of new lawsuits. The FTC’s enforcement authority exists pursuant to Section 5(a) of the FTC Act, which outlaws “unfair or deceptive acts or practices in or affecting commerce” but does not contain a private right of action, so plaintiffs cannot bring new suits based on the new enforcement priorities; they have no private right to enforce them.

However, these areas of focus could influence broader trends in civil litigation, even if, on their own, they do not create any new liability. Successful enforcement actions by the FTC could bring about new industry standards with respect to algorithmic bias, dark patterns, and other areas of focus. These standards, in turn, could be cited in consumer privacy class action complaints. New civil actions could also stem from enforcement actions by the FTC and the information revealed in settlements resulting from such actions. For example, the FTC announced a settlement with fertility-tracking app Flo Health Inc. in January; this month, a consolidated class action complaint was filed against Flo Health, stemming from seven proposed class actions filed against it this year, alleging that the app unlawfully shared users’ health information with third parties.

Although the FTC’s new enforcement priorities seem ambitious, recent developments may impede its ability to bring enforcement actions in these areas. The agency was dealt a blow in April of this year, when the Supreme Court ruled in AMG Capital Mgmt., LLC v. FTC that the agency lacks power to seek monetary recovery under Section 13(b) of the FTC Act. Legislation to restore this power to the agency passed the House but is awaiting a Senate vote. More recently, the Senate voted to advance the nomination of Rohit Chopra, currently a Democratic Commissioner, to lead the Consumer Financial Protection Bureau. The White House announced that President Biden will nominate Alvaro Bedoya, a privacy scholar with expertise in surveillance and data security, to fill Commissioner Chopra’s seat. As Commissioner, Bedoya’s likely priorities include the FTC’s enforcement of various privacy laws, including the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act, which could further impact litigation brought under those statutes.

As seasoned data privacy and biometric litigators are already aware, the United States does not have a comprehensive federal law regulating the collection, processing, disclosure, and security of personal information (“PI”), typically defined as information that identifies, or is reasonably capable of being linked to, an individual. Rather, privacy in the US is governed by a patchwork of federal and state sectoral laws.

The California Privacy Protection Agency (CPPA) Board, created by the California Privacy Rights Act (CPRA), has been busy of late. As we recently reported, the CPPA has hired renowned privacy technologist Ashkan Soltani as its new Executive Director to lead the agency. Meanwhile, the agency’s committees have been hard at work. The Regulations Subcommittee has proposed a framework for its rulemaking process. Notably, the subcommittee recommends an immediate start to pre-rulemaking activities such as issuing an invitation for comments, the creation of additional subcommittees, and the identification of informational hearing topics. A pre-rulemaking process gives the agency flexibility to hear from stakeholders outside of the formal and constrained process that will begin once the regulatory process officially commences. The framework also notes that the notice of proposed rulemaking, initial statement of reasons (ISOR), and text of the regulations should be published in winter 2021-2022, with public hearings taking place thereafter. This suggests that stakeholders have a short window of opportunity to take advantage of the pre-regulatory educational period. It will be interesting to see if the agency conducts the kind of “listening tour” the Office of Attorney General (OAG) went on across the Golden State by means of town halls prior to its California Consumer Privacy Act (CCPA) rulemaking process, or elects to spend its time in more intimate and concerted explorations.

You can read more about this development at Security & Privacy Bytes, here.

As reported in Law360, Ashkan Soltani, “[a] prominent security researcher and former chief technologist at the Federal Trade Commission,” has been selected “to lead the day-to-day operations at California’s new privacy protection agency, which will be the first authority in the U.S. to focus solely on policing how companies handle consumers’ personal data.” CPW’s Alan Friel, deputy chair of the data privacy, cybersecurity and digital assets practice at Squire Patton Boggs, told Law360 Monday that Soltani’s selection demonstrated the state’s commitment “to developing a first class data protection authority.” “His credentials are simply beyond reproach, and having someone with Ashkan’s level of technical and regulatory agency experience will help the agency meet its goals and obligations,” said Friel, who’s based in California and specializes in counseling clients on compliance with the state’s privacy regimes.

Read the entire article here.

Yesterday, the California Privacy Protection Agency announced the appointment of Ashkan Soltani as Executive Director. Soltani is tasked with overseeing the Agency’s implementation of the California Consumer Privacy Act of 2018 (CCPA), as amended by the California Privacy Rights Act of 2020 (CPRA), as well as enforcement, rulemaking, and other agency operations. Soltani was a key figure in the creation of both of California’s sweeping privacy laws.

Soltani is regarded as an expert in the privacy and security space. He is a Distinguished Fellow at the Georgetown University Law School Institute for Technology Law & Policy and the Center on Privacy and Technology. Previously, he served as Chief Technologist at the Federal Trade Commission and as a senior advisor to the White House Chief Technology Officer, where he advised on and helped to shape technology-related public policy. During his time with the FTC, Soltani assisted with the government’s investigations of Google and Facebook. After leaving the FTC, Soltani was engaged as a technology expert by a number of state consumer protection agencies, including being retained in a prominent multistate privacy and data security investigation by state attorneys general targeting large social media and technology companies.

Soltani also serves as a member of EFF’s advisory board, where he has collaborated on a number of efforts related to government and third-party tracking. His research on tracking technologies was the basis for the Wall Street Journal’s award-winning “What They Know” series.

Soltani will lead the Agency effective immediately.

This article was originally published on February 23, 2021, by the American Bar Association and is republished here with permission. For more information, visit www.americanbar.org.

The article expands on our original report on the Virginia Consumer Data Protection Act published on February 2, 2021.

In the coming days, Governor Ralph Northam is expected to sign into law the Virginia Consumer Data Protection Act (the “Act”), which, if enacted, will become effective on January 1, 2023. As a result, Virginia would become the second state in the US to enact a holistic data privacy law that purports to regulate the collection, use and disclosure of the personal data of its residents generally.

Overview and Quick Take

In many ways, the Act is similar to the California Consumer Privacy Act (the “CCPA”), the first holistic data privacy law in the US, and to the California Privacy Rights Act (the “CPRA”), which was enacted by ballot referendum in November 2020. It also shares some concepts with the EU’s General Data Protection Regulation (the “GDPR”). However, it is sufficiently dissimilar to each of those laws that a business developing a compliance strategy for the Act will not be able to rely solely on its previous compliance efforts.

Continue Reading Virginia Set to Become Second State to Enact Holistic Data Privacy Law