Federal Communications Commission (“FCC”) Chairwoman Jessica Rosenworcel announced on May 17, 2022, “new robocall investigation partnerships with the Attorneys General of Iowa, Florida, Louisiana, Maine, Massachusetts, Mississippi, Nevada, New Hampshire, and South Carolina.” In addition to these new agreements, the FCC is building on its existing robocall investigation partnership with the New York Department of State.

This means that 36 States and the District of Columbia have now signed Memoranda of Understanding with the FCC Enforcement Bureau “to share evidence, coordinate investigations, pool enforcement resources, and work together to combat illegal robocall campaigns and protect American consumers from scams.”

https://www.fcc.gov/document/fcc-signs-robocall-partnerships-nine-more-state-attorneys-general

Last week, the Federal Trade Commission (“FTC”) held an open meeting focused on children’s privacy and on the use of endorsements and testimonials in advertising. At the meeting, the FTC adopted a new policy statement targeting data collection practices in educational technology. It also proposed amendments to the Guides Concerning the Use of Endorsements and Testimonials in Advertising (“Endorsement Guides”) that would target child-directed marketing. Of note, one of the amendments would recognize that children may react to advertising practices differently than adults, and thus advertising practices directed at children may be treated differently by the FTC than those directed at adults.

This week, team members from our Data Privacy, Cybersecurity & Digital Assets Practice in Los Angeles, London, Madrid and Berlin hosted a webinar, “Navigating Opportunities and Challenges: Cross-border Data, the Cookiepocalypse and Standard Contractual Clauses.”

If you missed the live webinar, don’t worry! Below you can access the full recording and watch the program.

In the session, contributors Francesco Liberatore, David Naylor, Malcolm Dowden, Dr. Annette Demmel, Bartolome Martin, and Kyle Fath discussed how European and UK data protection laws present significant challenges to organizations whose business models depend on the international flow of personal data, examined the new Standard Contractual Clauses, and provided a breakdown of the IAB, Google Analytics and Cookiebot decisions. The panelists helped attendees understand how they can effectively:

  • Respond to the post-Brexit separation of EU GDPR and UK GDPR
  • Ensure they are not affected by the increasing use of “stop orders” to prevent unlawful transfers of personal data
  • Employ cookies and other digital technologies
  • Select the appropriate legal, technical and organizational measures to protect business-critical transfers of personal data and implement best practices for conducting personal data adequacy assessments
  • Design and implement a data-driven compliance strategy

The full webinar recording is available here and the presentation slides can be found here.

Thank you again to all who had the opportunity to attend. We look forward to hosting you at our next event.

Following on from a string of cases in 2021 concerning minor data breaches (see our earlier article here), two further cases in Q1 of 2022 have continued the trend of High Court scepticism. Such compensation claims, usually involving multiple causes of action, often find themselves trimmed down and sent to the County Court, if not struck out entirely.

In our review below, we shed some light on the judiciary’s attitude towards opportunistic claimants.

Condemnation of the “kitchen sink” approach – William Stadler v Currys Group Limited [2022] EWHC 160 (QB)

Whilst claimants continue to pile up multiple causes of action in data breach compensation cases, presumably in the hope of increasing their prospects of a successful recovery, this approach appears to have the opposite effect, as the claimant in this case discovered.

Mr Stadler purchased a Smart TV from Currys in September 2016 and logged into various apps on it, including Amazon Prime. He returned the TV to Currys in September 2020 for repair; he was not asked to wipe the data from the TV before returning it and did not log out of any apps before leaving it with Currys. Repairing the TV was considered too costly, so Currys wrote it off and sold it to a third party company without having wiped the data from it. Someone subsequently purchased a film from Mr Stadler’s Amazon account.

Mr Stadler telephoned Currys, who reimbursed him for the cost of the film and ensured that he had logged out of all apps and changed his password. Currys also gave Mr Stadler a £200 shopping voucher as a gesture of goodwill. Nonetheless, Mr Stadler went on to issue proceedings, alleging misuse of private information, breach of confidence, negligence and breach of data protection laws (Article 82 of the UK GDPR and the Data Protection Act 2018). He claimed aggravated and exemplary damages up to £5,000, as well as an injunction requiring compliance with the data protection law in question and a declaration that the data processing had breached Article 5(1) of the GDPR.

Currys applied to strike out the claim on the basis that Mr Stadler had no reasonable grounds for bringing it, pointing out that he had already been compensated and that it would be an abuse of process to allow him to proceed with costly litigation in such circumstances.

The judge found that Mr Stadler had not pleaded his case adequately – the confidential information in question was not properly identified, nor was the obligation of confidence, and it was unclear what constituted the alleged misuse given that Currys had not taken any positive action to leak the data and in fact had no actual knowledge of the misuse of the data. A failure to wipe the device was insufficient for either a breach of confidence or misuse of private information claim. Mr Stadler’s negligence claim failed because there was no actionable harm; Currys had already reimbursed Mr Stadler financially and distress was insufficient for negligence without a resulting recognised psychiatric illness. In addition, there was no need to impose an additional duty of care when data protection legislation already imposed an adequate duty.

As such, the judge only allowed the claims under data protection legislation to continue and struck out the claims for misuse of private information, breach of confidence and negligence. The judge was heavily critical of Mr Stadler’s strategy, saying that these multiple causes of action ‘increased the complexity of the proceedings unnecessarily’, that it was “difficult to understand on what basis the claimant could have sought to recover aggravated and exemplary damages, nor the purpose of the injunction” and that “[T]hese claims appear wholly misconceived and without merit”.

Directing that the remaining data protection claim be transferred to the County Court (in keeping with recent decisions in data breach cases; see our earlier article on this here), he commented: “This is a very low value claim. Consumer disputes of equivalent complexity are heard every day in the County Court on the small claims track and do not need to be dealt with by a High Court Judge.”

There now appears to be a general consensus among High Court judges that minor data breach cases are well suited to the County Court and capable of being dealt with on the small claims track. This recognition may, in time, lead to a reduction in the number of small data breach claims brought on a no-win, no-fee basis, given the limited ability to recover legal costs on the small claims track.

Sympathy only gets you so far – Underwood & Anor v Bounty UK Limited and Hampshire Hospitals NHS Foundation Trust [2022] EWHC 888 (QB)

Even for claimants with whom the judge sympathises, a sense of proportionality is applied. The claimants here had even less success than Mr Stadler and were not allowed to proceed with their data breach claim at all.

The first defendant in this case, Bounty UK, was a ‘pregnancy and parenting support club’, providing expecting and new parents with information packs and other services, and (until 2018) supplying data to third parties for electronic direct marketing. Bounty had previously been investigated by the Information Commissioner’s Office (“ICO”) in 2017 and 2018 in relation to its practice of collecting records of parents’ full name, date of birth, email address, postal address, pregnancy status and status as a first time mother, along with the name, gender and date of birth of their baby. These records were then shared with 39 organisations based on consent allegedly received during the registration process. The ICO held that this consent was not informed and that data subjects could not have foreseen that their data would be shared with third party organisations, and Bounty were therefore fined £400,000. As the judge put it at paragraph 11 of the judgment, ‘Bounty’s business model was largely based upon harvesting data from expectant mothers in order to sell that data on to third parties’.

Bounty had a distribution agreement with the second defendant hospital to enter and distribute “Bounty Packs” to new mothers, as well as providing a photography service. These services would encourage new mothers to sign up on the Bounty app, enabling Bounty to sell their data onwards. The hospital terminated its agreement with Bounty following the ICO investigation. The events of Underwood took place prior to this termination.

The claimants were a mother who had given birth in the second defendant hospital and her child. The mother had signed up on the Bounty app, which would have required her to provide her name, hospital number and address. A Bounty representative approached the claimants at their hospital bedside shortly after the birth in a way that was unwelcome to the mother. Later on, the mother received random telephone calls and emails from third party companies and suspected Bounty of having passed on information about herself and her newborn child, most likely obtained by looking at documents at the end of the hospital bed.

Proceedings were issued and general, aggravated and exemplary (i.e. punitive) damages were claimed for breach of the Data Protection Act 1998 (“DPA 1998”) and/or misuse of private information. The mother accused Bounty of accessing data about herself and her child from medical information at the end of the bed and also accused the hospital of allowing this to happen.

As Bounty had subsequently gone into administration, the trial concerned whether the hospital was liable for Bounty’s obtaining of the information and, therefore, for the alleged breach and misuse.

The judge concluded that the hospital was not liable for the Bounty representative’s unauthorised access to the information. As in Stadler (above) and Warren v DSG (here), there was no “misuse” of the information, because it was obtained without the hospital’s consent or knowledge; Bounty had signed up to a Code of Conduct imposed by the hospital which emphasised the need to respect expectant mothers’ privacy. Nor was there any “processing” of data by the hospital in contravention of the DPA 1998. A suggestion that the hospital had failed to take appropriate technical and organisational measures to prevent unauthorised access to the medical charts at the foot of the bed was also rejected, on the grounds that this would impose an inappropriate and unnecessary requirement for all patient data to be strictly withheld, when access to it was needed so that the hospital could carry out its function of providing medical care. Both the claim for misuse of private information and the claim for breach of the DPA 1998 were therefore dismissed, the ‘real wrongdoer’ being Bounty rather than the hospital.

Whilst the judge stated that he appreciated the decision would be a ‘disappointment’ for the Underwood family and could ‘certainly understand’ why they felt that their data had been exploited, he also made two further comments which, albeit obiter, will undoubtedly prove useful to those seeking to reject compensation claims in respect of minor breaches. First, he considered that even if the hospital had been liable, the actual data which was accessed unlawfully by Bounty in the hospital (i.e. the baby’s name, gender and date of birth) was not serious enough to engage the tort of misuse of private information. Second, and in terms of the remedy sought, he stated that it is ‘never appropriate to add a claim for exemplary damages simply to mark how upset the claimant is about the defendant’s conduct, or as some sort of negotiating strategy’.

This case therefore shows that, even in situations creating great emotional sympathy for the claimant, unauthorised access to data by a third party does not automatically lead to a successful claim against the data controller – although the obiter comments about Bounty suggest that the outcome would be different if the claim were brought against the third party wrongdoer.

Provided that appropriate organisational and technical measures are in place, data controllers can now be relatively comfortable that the High Court will continue to treat opportunistic data breach claimants with suspicion. Claims that throw in unnecessarily numerous causes of action and seek damages for vague losses falling short of a recognised psychiatric illness are likely to be transferred to the County Court small claims track, if not struck out entirely.

The California Privacy Protection Agency (“CPPA”) will host its next public meeting on Thursday, May 26, 2022 at 11 AM PT. Members of the public may attend in person or virtually by following these instructions. CPPA Director Ashkan Soltani will provide an update on the CPPA’s hiring, budget, and rulemaking activities. Importantly, subcommittees will provide more information on the planned course of the upcoming rulemaking process, as well as on the anticipated rulemaking draft.

In February, the CPPA outlined its strategy of hosting informational preliminary hearings to ensure that the rules it adopts adequately address the most prevalent issues in consumer privacy, and anticipated that the rulemaking process, including formal public hearings, would commence in the third quarter and continue into the fourth quarter of 2022. Earlier this month, the CPPA held a pre-rulemaking stakeholder session during which it heard public comments on automated decision-making, with most comments focusing on: (1) the types of automated decision-making activities that should be regulated; (2) consumer rights relating to the use of automated decision-making technology; (3) consumer opt-out rights relating to automated decision-making; and (4) alignment with the General Data Protection Regulation and other regulatory schemes.

Although final Regulations are not anticipated until sometime in early 2023, the California Privacy Rights Act amendments to the California Consumer Privacy Act (“CCPA”) will go into effect in January 2023. Businesses should therefore monitor CPPA rulemaking activities to ensure they are aware of how the lead CCPA enforcement agency interprets the CCPA’s requirements, and to glean insight into the agency’s potential enforcement priorities.

Kyle Dull, Senior Associate (New York/Miami), will chair a panel at the International Institute of Communications’ Annual LatAm & Caribbean Telecommunications & Media Forum on May 20, 2022 at 11:00 AM (ET) on the topic “Artificial Intelligence and Data Governance – Regulatory Initiatives and Considerations.” The panelists include:

Carlos Rebellón, Director Americas, Mexico & Canada, Global Government Affairs, Intel

Vitelio Ruiz Bernal, General Director of Investigation and Verification of the Private Sector, National Institute of Transparency, Access to Information and Protection of Personal Data (INAI), Mexico

Ángel Melguizo, Independent Consultant – representing UNESCO

Dr. Nicholas F. Tsinoremas, Vice Provost for Research Computing and Data, Institute for Data Science and Computing, University of Miami

To sign up, register at the IIC’s event page.

Now in its 9th year, the IIC’s annual Latin America & Caribbean Telecom & Media Forum will once again be in-person, in Miami. Regulators, policy makers, industry and civil society from South & Central America and the Caribbean will gather to discuss the challenges and opportunities posed by digital transformation.

The Federal Trade Commission (“FTC”) announced that its next open Commission meeting, to be held virtually on May 19, 2022 at 1 PM ET, will focus on issues related to children’s privacy and the use of endorsements and testimonials in advertising.

In case you missed it, below are recent posts from Consumer Privacy World covering the latest developments on data privacy, security and innovation. Please reach out to the authors if you are interested in additional information.

CJEU Rules Consumer Associations Can File Data Infringement Class Actions Without a Consumer Mandate

CPW’s Scott Warren Joins Faculty of PrivSec Focus: Enterprise Risk

German Supervisory Authorities: Online Traders Must Allow Guest Access for Customer Orders

BREAKING: FTC Nominee Bedoya Confirmed

A Do-Not-Miss CLE Webinar: Cross-border Data, the Cookiepocalypse and Standard Contractual Clauses

Google to Require Apps to Display “Data Safety” Information by July 20, 2022

Connecticut and Utah Latest States to Jump On Consumer Privacy Bandwagon

Noteworthy Information in the French Data Protection Authority’s (CNIL) Newly Published 2021 Annual Report

California Privacy Protection Agency Continues Rulemaking Focus on Automated Decision-Making and Profiling in Stakeholder Sessions

Episode 2 Out Now: APAC Partner Scott Warren Discusses Data Privacy Laws in China

NOW AVAILABLE: Lexis Practical Guidance Releases CPW Team Member David Oberly’s “Mitigating Legal Risks When Using Biometric Technologies” Biometric Privacy Practice Note and Biometric Privacy Compliance Checklist

Registration Open: CPW’s Kyle Fath and Kristin Bryan to Discuss Artificial Intelligence and Biometrics in New IAPP Virtual Event

Aerojet Rocketdyne Cybersecurity Trial and Settlement

“Dark Patterns” Are Focus of Regulatory Scrutiny in the United States and Europe

Webinar Recording Now Available: 2022 Developments and Trends Concerning Data Breach and Cybersecurity Litigation and Related Matters

Third Circuit Issues Order in WaWa Data Breach

California Privacy Regulator to Hold Stakeholder Sessions First Week of May

Article 80(2) of the General Data Protection Regulation (GDPR) provides that Member States may give properly constituted not-for-profit bodies, organizations or associations that have statutory objectives in the public interest, and that are active in the field of the protection of data subjects’ rights and freedoms, the right to lodge complaints with the supervisory authority when they consider that the rights of data subjects have been infringed.

This group of entities includes consumer associations, which have the right throughout the EU to lodge complaints on behalf of their members. It was not entirely clear, however, whether they needed a mandate from those members (in this case, consumers) to lodge such complaints. This is precisely the point that the Court of Justice of the European Union (CJEU) clarified in its decision of 22 April 2022 in Case C‑319/20.

The question at issue was whether Art. 80 of the GDPR precludes national rules, such as those under German law, which give the Federation (a German consumer protection association) legal standing to bring actions for injunctions based on a data protection infringement in the name of consumers, without a mandate from them.

The Court explains that both the personal and the material requirements of Art. 80 GDPR are satisfied. The personal requirement is fulfilled because the Federation pursues a public interest purpose consisting of guaranteeing the rights and freedoms of data subjects in their capacity as consumers, an objective which may be connected with the protection of their personal data. The material requirement is also deemed satisfied, as it is only required that the entity concerned ‘considers’ that the rights of a data subject under the GDPR have been infringed.

The bottom line of the decision is that Art. 80(2) GDPR does not preclude national provisions that empower consumer associations, without a mandate for that purpose, to bring class actions to ensure compliance with the rights conferred by the GDPR by means of rules designed to protect consumers or to prevent unfair commercial practices.

The Court adds that, in order to bring such class actions, the entity representing data subjects cannot be required first to individually identify the person who specifically has the status of data subject affected by the data processing allegedly contrary to the provisions of the GDPR. Nor is it necessary to allege a concrete breach of the rights conferred by the data protection rules, or the existence of actual damage suffered by the data subject as a result of the infringement of his or her rights.

The Court concludes that this interpretation is consistent with the objective pursued by the GDPR: to ensure effective protection of the freedoms and fundamental rights of natural persons and the reinforcement of the rights of data subjects by providing them with a high level of protection.

The question now is whether this decision will prove a turning point for consumer associations, and whether they will follow the Federation’s lead and proactively pursue data protection breaches on their radar without a mandate from their members.