Scott Warren, Partner (Tokyo/Shanghai), will be speaking at PrivSec Focus: Enterprise Risk on May 18, 2022 at 15:10 (British Summer Time) on the topic “Creating a Robust Data Breach Management Policy”. During the global livestream, Scott will highlight the key elements of a data breach management policy, from detection through mitigation and review. The webinar is free to attend.

To sign up, register at PrivSec’s event platform.

PrivSec Focus: Enterprise Risk brings together thought leaders and industry experts in enterprise risk to explore how businesses can better protect themselves against ever-changing and unpredictable threats.

In a resolution dated 24 March 2022, the Conference of German Supervisory Authorities in Data Protection (Datenschutzkonferenz – “DSK”) provided guidance for data protection-compliant online trading of goods and services. The key message is that online customers must be given the option of guest access for their orders. According to the DSK, online traders (controllers) must therefore enable online purchasing without customers having to create an account.

The DSK recalled that the principle of data minimization also applies in online trading. Customers must be free to decide in each case if they want to enter their data for each order and thus be treated as a temporary guest, or if they are willing to enter into a permanent business relationship that is linked to an ongoing customer account.

The DSK is of the opinion that, without guest access or an equivalent ordering option, consent would not be given voluntarily. An ordering option can be considered equivalent if it does not entail disadvantages for the customer. The effort required to order through and access this option must be equivalent to that of a customer account.

The DSK further pointed out that a customer account allows online traders to evaluate the contract history for advertising purposes as well as to store information about means of payment. Such processing would require informed consent.

According to the DSK, there may nevertheless be special circumstances that justify the setting up of a customer account as necessary for the performance of a contract, e.g. for specialist retailers regarding certain professional groups. But even then, the principle of data minimization must be observed, e.g. by automatically deleting the customer account after a short period of inactivity.

In a follow-up to our previous post “Privacy Continues to be Top of Mind Issue With President Biden’s State of the Union Address and Movement on FTC Nominee Today,” the Senate confirmed Alvaro Bedoya as a commissioner on the Federal Trade Commission, giving Democrats 3-2 control over the agency’s enforcement activities and objectives.  The Senate vote split 50-50 along party lines, with Vice President Harris casting the tiebreaking vote.

President Biden had previously nominated Bedoya, a privacy scholar with interests in surveillance and data security, to fill Commissioner Chopra’s seat.  As Commissioner, likely priorities for Bedoya include the FTC’s enforcement of various privacy laws, including the Fair Credit Reporting Act and the Gramm-Leach-Bliley Act, which could further impact litigation brought under those statutes.

With a majority in hand and all commissioner seats filled, this may indicate a favorable time for Democrats to move forward with rulemaking, including in the realm of data privacy and cybersecurity.  As a reminder, earlier in the year Commissioner Wilson had indicated opposition to FTC action on data privacy, facial recognition and AI.  This was following the FTC’s December 2021 notice that it was “considering initiating a rulemaking under Section 18 of the FTC Act to curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination.”

For more on this, stay tuned.  The CPW team will continue to monitor FTC activities and provide you with relevant updates.

European and UK data protection laws present significant challenges to organizations whose business model depends on the international flow of personal data. Along with multimillion-dollar fines, supervisory authorities have the power to impose a mandatory “stop order,” requiring non-compliant data flows to cease.

Join us to learn how your organization can most effectively:

  • Understand and respond to the post-Brexit separation of EU GDPR and UK GDPR
  • Ensure that you are not affected by the increasing use of “stop orders” to prevent unlawful transfers of personal data
  • Employ cookies and other digital technologies
  • Select the appropriate legal, technical and organizational measures to protect your business-critical transfers of personal data and implement best practices for conducting personal data adequacy assessments
  • Design and implement a data-driven compliance strategy

Our program panelists, comprising members of our Data Privacy, Cybersecurity & Digital Assets Practice based in London, Madrid and Berlin, will also explain the new Standard Contractual Clauses and provide a breakdown of the IAB, Google Analytics and Cookiebot decisions.

Learn more and register here.

Google announced it will be rolling out a “Data Safety” section for apps listed on its app marketplace, Google Play, similar to Apple’s Privacy Nutrition Labels. The Data Safety section will provide consumers with a summary of an app’s privacy and security practices, including, but not limited to, what user data an app “collects” or “shares.” App developers (“Developers”) must complete the Data Safety form by July 20, 2022. Notably, Google has not implemented a tracking opt-in, like Apple’s App Tracking Transparency, in association with the Data Safety initiative. Because your app’s Data Safety disclosure will serve as a de facto additional privacy notice for your organization, development and product teams should consult with legal/privacy counsel as they populate the information. Below, we provide high-level instructions on populating the Data Safety form (“Form”) and on additional Google privacy requirements. If you are interested in further information on this topic, we have detailed guidance on Google Data Safety, as well as on Apple’s Privacy Nutrition Labels and App Tracking Transparency requirements, including detailed instructions on how to complete the forms (with screenshots), available for a fixed fee.

On May 10, 2022, Connecticut Governor Ned Lamont signed SB 6, “An Act Concerning Personal Data Privacy and Online Monitoring” (known as the CT Privacy Act (CTPA)), into law, effective July 1, 2023.

The French data protection authority, the CNIL, has published its annual report for 2021 (in French), which contains useful information and figures, notably on complaints, investigations and sanctions, as well as on standards of reference issued by the CNIL in relation to specific processing activities.

  1. Complaints, Investigations and Sanctions


In 2021, the CNIL received 14,143 complaints (an increase of 7% compared to 2020, but similar to 2019), of which:

    • 1,436 relate to access rights (28% of which are employee requests);
    • 1,906 relate to a request to delete names of corporate officers from online directories;
    • 973 relate to commercial, associative and political solicitation, by email (38%), SMS (29%), mail (20%) and phone (13%); and
    • Several complaints related to CCTV in the workplace.

Some complaints were transferred to another lead authority under the one-stop-shop and cooperation rules.

The CNIL also received 5,882 indirect data subject requests (indirect action being the only avenue available for certain databases, such as those of the police or secret services).

The CNIL reports that many complaints have been made about organizations established outside of the EU (UK, Switzerland, the United States, Canada, Russia, Australia, South Korea and China), mainly in relation to the publication of data on the Internet.


The CNIL carried out 384 investigations, 31% of which followed complaints or reports.

The CNIL highlights:

    • Cookies

Cookie compliance was one of the priority themes set by the CNIL for 2021, and the CNIL launched an unprecedented inspection campaign on the subject.

    • Health data

The CNIL also continued its control activities on the security of health data by investigating 30 medical analysis laboratories, hospitals, service providers and data brokers, notably in relation to COVID-19 pandemic related data. Some of these procedures are still ongoing.

    • Cybersecurity

The CNIL investigated 22 organizations, 15 of them public bodies, with respect to their level of internet security. The investigations revealed obsolete cryptographic suites leaving websites vulnerable to attack, shortcomings concerning passwords and, more generally, insufficient safeguards against current security threats.


The CNIL issued:

    • 135 formal notices; and
    • 18 sanctions for a record total amount of fines exceeding 214 million euros.

Out of the 18 sanctions:

    • 12 have been made public;
    • 15 consist of fines (5 with injunctions under penalty per day of delay);
    • 2 consist of calls to order with injunctions; and
    • 4 are decisions taken by the CNIL as a lead authority.

The most frequent breaches include:

    • Lack of information and excessive retention;
    • Lack of security; and
    • Cookies: 89 formal notices and 4 sanctions for the most serious cases of noncompliance, which concerned actors who did not allow millions of internet users to refuse cookies as simply as they could accept them.

The CNIL also issued two public sanctions against the Ministry of the Interior, concerning the illicit use of drones and poor management of the automated fingerprint file (FAED).

Investigation program for 2022

In February, the CNIL published its 2022 investigation program, which accounts for around one third of its investigations and sets out its priority focuses, covering the following three major topics:

    • Marketing activities/commercial solicitation

This follows the numerous complaints received on this topic and the publication, in February 2022, of a new “commercial management” reference framework, which notably frames the carrying out of commercial prospecting. The CNIL intends to investigate data brokers and other intermediaries.

    • Monitoring tools in the context of telework

The significant shift to teleworking has led to the development of specific tools, including tools allowing employers to ensure closer monitoring of the daily tasks and activities of employees. The CNIL considers it necessary to check the employers’ practices in this field.

    • Cloud

The CNIL intends to explore issues relating to data transfers and to the management of contractual relations between data controllers and cloud solution providers acting as subcontractors (processors).

  2. Data breach notifications

The CNIL received 5,037 data breach notifications (a 79% increase compared to 2020), 63% of which were due to an external cause (accident or malicious act). The CNIL considers this figure still too low compared to the number of data breaches that actually occurred.

  3. Support to public authorities and the legislator

The CNIL responded to 22 parliamentary hearings and issued 121 opinions on bills and decrees. 16 of these opinions concerned how data processing was implemented in the context of the fight against the COVID-19 pandemic.

  4. Decisions

The CNIL also handled 576 health authorization applications in 2021 and issued 54 research authorizations on COVID-19.

  5. Soft law and support to businesses

In 2021, the CNIL adopted several standards of reference and sectorial recommendations. These included:

    • Standards of reference relating to care, accommodation, social and medico-social support of disabled elderly persons;
    • Standards of reference relating to the designation of drivers who have committed a traffic violation;
    • Standards of reference relating to rental management;
    • Standards of reference for health data warehouses;
    • Recommendation on the exercise of data subject rights through a representative;
    • Interim recommendations for the quality control of clinical trials during the health crisis;
    • Recommendation on logging measures;
    • Draft standards of reference for the management of pharmacies; and
    • Practical recommendation in the insurance sector completing a 2014 compliance pack.

The CNIL has also developed tools to encourage responsible digital innovation, in particular through its “start-up” strategy deployed in 2017. This year, this resulted in the implementation of the first personal data sandbox, dedicated to health. As a result, 12 projects have been supported by the CNIL, 4 of them in a reinforced way.

  6. New sanction procedure for 2022

As of January 2022, a simplified sanction procedure created by law allows the CNIL, most notably, to handle a higher number of complaints. The sanctions that can be issued by the CNIL under this procedure are limited to a call to order, a fine of up to €20,000 and an injunction with a penalty capped at €100 per day of delay. These sanctions cannot be made public.

  7. Focus for 2022-2024

The CNIL has identified three areas in which it intends to establish a position and elaborate tools before including them in the investigation program:

    • Augmented cameras and their uses

The accelerated development in the field of so-called “augmented” cameras, often coupled with predictive algorithms, raises the question of the necessary and proportionate nature of these devices and runs the risk of large-scale monitoring of people.

    • Data transfers in cloud computing
    • Collection of data by smartphone apps

Faced with the opacity of technologies and the heterogeneity of practices, the objective of the CNIL is to make visible the data flows and strengthen the compliance of mobile apps and their ecosystems, to better protect the privacy of smartphone users.

If you need assistance in France on data protection issues, contact

The California Privacy Rights Act (“CPRA”) places significant power in the hands of the California Privacy Protection Agency (“CPPA” or “Agency”) to influence the future of privacy regulation in the United States, including—perhaps most importantly—the authority to issue regulations in twenty-two specific, enumerated areas to achieve the broad objective of “further[ing] the purposes of” the CPRA.

As to automated decision-making and profiling, the CPRA has granted the Agency the equivalent of a regulatory blank check. In this regard, the CPRA references profiling or automated decision-making a total of two times throughout the voluminous text of the statute: first, in defining the term “profiling,” and second, in the law’s broad rulemaking mandate:

Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.

For this reason, the CPPA has focused a significant amount of its preliminary rulemaking activities on automated decision-making and profiling. This focus began in the fall of 2021 when profiling and automated decision-making were included as part of nine topics on which the Agency sought public comment. In late March, the CPPA hosted informational sessions—during which time the Agency discussed automated decision-making for the majority of an entire day, including cross-jurisdictional approaches to automated decision-making and profiling under the EU’s General Data Protection Regulation.

Just last week, the CPPA held stakeholder sessions (Agenda here) over the course of three days, setting aside three hours in the first half of the first day for stakeholders to comment on automated decision-making. Importantly, these comments, provided by a range of stakeholders, offer key insights into some of the more complex, challenging issues that businesses will face when adapting their privacy programs to comply with the new rules and restrictions that will be placed on automated decision-making under the CPRA beginning at the start of 2023.

The comments and positions of the individuals that spoke on the topic of automated decision-making varied widely. However, several common, key themes reiterated throughout the session shine a light on concerns shared by various stakeholders, as well as the tug of war between their (and others’) competing interests. The stakeholder comments also highlighted the complexity of regulating automated decision-making technology and profiling in a privacy-protective manner while avoiding overly restrictive regulations that would hamper innovation. Many of the comments fell under the following themes:

  • The Type of Automated Decision-Making Activities That Should Be Regulated: Many speakers highlighted the potentially significant, unintended ramifications of an overly broad scope for the term “automated decision-making technology,” which would produce little benefit to consumers while greatly hampering the operations of businesses across all sectors. For that reason, many speakers emphasized the need to limit the reach of automated decision-making regulation to: (1) fully automated decision-making technology; (2) technology that produces legal or similarly significant effects, such as those bearing on a consumer’s employment or credit; and/or (3) high-risk activities, sensitive data, and/or automated decision-making that constitutes profiling. In addition, several other speakers noted the need for a requirement that the term encompass only those activities that involve the processing of personal information (which would seem to be inherent in the CPRA regardless).
  • Consumer Rights Relating to the Use of Automated Decision-Making Technology: Speakers also frequently highlighted the need for balance as it relates to consumers’ access rights regarding automated decision-making technology. On the one hand, as many speakers suggested, the CPRA should not impose requirements on businesses to disclose information to consumers on low-risk automated decision-making technology, such as spell check or spreadsheets. On the other, the CPPA was cautioned to avoid crafting regulations that afforded access rights that would require businesses to provide detailed descriptions of complex algorithms involved in automated decision-making, as doing so would fail to provide average consumers with “meaningful” information regarding the information and logic underlying automated processing. At the same time, the required disclosure of algorithms and similar sensitive business information would also likely conflict with the right of businesses to protect their trade secrets and similar types of information.
  • Consumer Opt-Out Rights Relating to Automated Decision-Making: Many speakers shared the common concern that the significant benefits offered by automated decision-making technology to consumers and businesses alike could be severely hampered by granting consumers overbroad opt-out rights as it relates to activities that fall under the definition of automated decision-making. At a minimum, several speakers suggested, regulations relating to automated decision-making should be tethered to the CPRA’s statutory rights of access and opt-outs.
  • Alignment with the GDPR and other Regulatory Schemes: Many stakeholders, including a representative of the Future of Privacy Forum, urged that the regulations should align with GDPR Article 22. Others pointed to the EU’s pending Digital Services Act, as well as the Artificial Intelligence Act, for other schemes with which the CPRA’s regulations should be consistent.


Following the CPPA’s May stakeholder sessions, the CPPA will begin the formal rulemaking process, but final regulations are not anticipated until sometime in early 2023. Companies should monitor developments in CPPA rulemaking to ensure they are aware of any anticipated changes in the law, which will go into effect at the start of 2023. In addition, companies should immediately begin adapting their privacy programs for compliance not only with the CPRA but also with the Colorado, Connecticut, Virginia and Utah laws that will come online over the course of 2023.

For more information on the stakeholder sessions, including other topics discussed, you can visit the CPPA’s events page here.

Check back often for more of SPB’s and CPW’s thought leadership on the CPRA and the other 2023 state privacy laws, as well as on AI and automated decision-making. For a further discussion of the CPPA’s approach to rulemaking on automated decision-making and profiling, you can view a recording of our recent webinar 2022 Developments and Trends Concerning Biometric Privacy and Artificial Intelligence. In addition, SPB Partners Kyle Fath and Kristin Bryan will take a deeper dive into this and related topics in our June 2 webinar hosted by the International Association of Privacy Professionals (IAPP). Registration for the IAPP webinar is available here (free for IAPP members).

In Episode 2 of the four-part ‘Global Privacy Podcast’ series, Scott Warren, our APAC data privacy partner in Tokyo and Shanghai, joins host Nadia Ishaq to discuss the latest developments in China’s new data privacy and related laws: how they are like and unlike the GDPR, and the challenges you may face in moving personal data out of China.

Find it on #Spotify and #ApplePodcasts.

And, in case you missed Episode 1, it’s available here.


In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

NOW AVAILABLE: Lexis Practical Guidance Releases CPW Team Member David Oberly’s “Mitigating Legal Risks When Using Biometric Technologies” Biometric Privacy Practice Note and Biometric Privacy Compliance Checklist

Registration Open: CPW’s Kyle Fath and Kristin Bryan to Discuss Artificial Intelligence and Biometrics in New IAPP Virtual Event

Aerojet Rocketdyne Cybersecurity Trial and Settlement

“Dark Patterns” Are Focus of Regulatory Scrutiny in the United States and Europe

Webinar Recording Now Available: 2022 Developments and Trends Concerning Data Breach and Cybersecurity Litigation and Related Matters

Third Circuit Issues Order in WaWa Data Breach

California Privacy Regulator to Hold Stakeholder Sessions First Week of May

CPW’s David Oberly Talks to Legaltech News About the Tech Industry’s Increased Focus on Enhancing Privacy Protections Over Consumers’ Sensitive Personal Information

Indiana Amends Data Breach Notification Law

Connecticut General Assembly Passes Comprehensive Privacy Bill

Federal Trade Commission Proposes Adjustments to Telemarketing Sales Rule, Including B2B Telemarketing Calls

Law360: Anti-Bias Considerations For Finance Cos. After CFPB Notice

Court Rejects Demand for “Corrective” Notice in Blackbaud Data Breach MDL

APAC Partner Scott Warren Featured in New Multi-Part Global Privacy Podcast Series

Double Trouble: Why Organisations Need to Consider the Legal Consequences of Ransomware and DDoS Attacks

BREAKING: Federal Judge in Illinois Affirms that BIPA Extends to Information Derived from Photographs

FCC Chairwoman Announces Proposed New Rules to Combat International Scam Robocalls 

Fourth Circuit Rules on Data Privacy and Trade Secret Claims Brought in Context of Former Employee/Employer Dispute

FDA Publishes Draft Cybersecurity Guidance for Medical Devices

Badgerow v. Walters: Supreme Court Holds that “look through” Approach Does not Apply to Requests to Confirm or Vacate Arbitral Awards Under the FAA

SEC Chair Reiterates New Potential Cyber Regulations at Financial Sector Meeting

Implications of Judge Jackson’s Confirmation for Data Privacy and Cybersecurity Litigations Going Forward

Congratulations to CPW’s Kyle Dull on Being Named to the 2022 Law360 Consumer Protection Editorial Board!

FCC Seeks Letters of Intent to Serve as Traceback Consortium on Suspected Unlawful Robocalls

hiQ Labs v. LinkedIn: Ninth Circuit Indicates Tort-Based Claims, Not CFAA, Appropriate Legal Theory for Attacking Data Scraping Practices

CPW on the Speaking Circuit in April: Scott Warren to Present on Digital Crime