Hi friends

Eric J. Troutman here, mythical (or is it mystical?) Czar of the TCPAWorld.

It is no secret that I’ve been excited to expand our offerings beyond the TCPA–and the fact that SCOTUS may strike down the TCPA at any minute has a little something to do with it. Ha.

But in truth, the pursuit of this new legal WORLD to explore was driven by YOU, my esteemed and splendid readers and friends.

How many of you have asked, at one of my many, many, many speaking engagements over the years, a cross-over question regarding the CCPA or data privacy? Indeed, every company interested in the TCPA is–to some degree or another–interested in data security and applicable law. (I even did a webinar on this once–and I hate webinars.)

How many of my dear clients have sought guidance on the FCRA–noting the complete lack of ANY meaningful internet resource on the subject? (For shame internet!)

And of course BIPA–who had ever even heard of that statute before Jay Edelson’s huge interview on my podcast last year? I don’t see many hands out there. And that’s because the phenomenon of BIPA litigation is taking root right before our very eyes.

All three of these areas of law–along with the alphabet soup of enactments like CIPA, SCA, ECPA, and yes even HIPAA (shy wonderful HIPAA)– are fast-paced and developing. They need attention and meaningful analysis by real privacy lawyers steeped in this stuff and from a firm with the resources to devote to tracking case law developments and spotting trends in real time–as they develop.


Well, because you’ve asked for it, that’s why. And so we delivered.

For those of you familiar with TCPAWorld.com–and you all are, aren’t you?–we take the mission of chronicling and exploring case law and related developments incredibly seriously, but we don’t take ourselves too seriously. Pretense is dull. So are barriers to content. Plus, lawyers often hide behind legalese when they don’t really understand what they’re trying to say. (But I’m not telling you anything you don’t already know now, am I?)

None of that here.

We’ll review all the case law and give you exactly what you need to know, and we’ll try to do it in a way that is light-hearted and relatable. At times–dare I say–even entertaining (although some of us are better at that than others. Ha.)

Our formula is simple–if something happens out there in the wide world of consumer privacy law, we want to give it to you straight and as immediately as possible. You need to know this stuff right now–not days or *cough* weeks later. And you don’t want gobbledygook or nonsense. We get it.

More than that, you want to trust that you can rely on what you read, and you want a single resource that will comprehensively cover the law that matters most to you–from all angles.


Squire Patton Boggs has assembled its truly amazing team of privacy lawyers–I mean, look at this team–and spared no resource to assure that consumerprivacyworld.com is exactly what you need it to be–timely, smart, engaging analysis you can work with and learn from.

So welcome to your new privacy law wonderland! Please do make it YOUR wonderland. If you have questions or thoughts on how we can improve–reach out. Don’t like an article or disagree with some analysis? Let us know. And of course if you actually do like something you see here–tell us. We want to know how to make your experience on consumerprivacyworld.com as useful as possible.

We sincerely hope you’ll enjoy your stay and take the time to appreciate everything the website has to offer (we’ll be rolling out new features soon–don’t worry if it feels a bit Spartan in the short term; bells and whistles and a merry-go-round will be installed shortly).

It is great to have you here. Enjoy–and tell a few dozen pals.

Thanks friends. Chat soon.

Yesterday, the Seventh Circuit weighed in on the critical issue of whether a plaintiff bringing a data privacy action – this time, under Illinois’ Biometric Information Privacy Act (“BIPA”) – has Article III standing to sue in federal court.  In a twist that civil procedure buffs will love, Plaintiffs claimed that they did not have standing (because they preferred to litigate in state court), while defendant claimed the Plaintiffs had valid standing.  Ultimately, Plaintiffs won the day, and the Seventh Circuit affirmed the district court’s remand to Illinois state court.[1]

In Thornley v. Clearview AI, No. 20-3249, 2021 U.S. App. LEXIS 1006 (7th Cir. Jan. 14, 2021), Plaintiffs initially filed suit in Illinois state court against Clearview, alleging violations of BIPA §§ 14/15(a), (b), and (c).  Clearview removed the case to federal court, and Plaintiffs voluntarily dismissed, only to bring another claim against Clearview in Illinois state court.  This time, Plaintiffs alleged only a violation of BIPA § 15(c), with a narrower proposed class definition.  When Clearview promptly removed to federal court again, Plaintiffs filed a motion to remand, alleging (under Spokeo) that the BIPA violation alleged in the complaint was a “bare procedural violation, divorced from any concrete harm” that did not support Article III standing.  In general, a plaintiff must show that three things are true to establish Article III standing, which is necessary for suit in a federal court: (1) that he or she suffered an injury in fact that is concrete, particularized, and actual or imminent, (2) that the injury was caused by the defendant, and (3) that the injury would likely be redressed by the requested judicial relief.  The Thornley court focused only on the first factor (the “injury in fact requirement”).

The Thornley court first turned to relevant Seventh Circuit precedent in which it had already examined the issue of Article III standing under BIPA, pointing back to its decisions in Miller v. Southwest Airlines Co., 926 F.3d 898 (7th Cir. 2019); Bryant v. Compass Group USA, Inc., 958 F.3d 617 (7th Cir. 2020); and Fox v. Dakkota Integrated Systems, LLC, 980 F.3d 1146 (7th Cir. 2020).  It did so to underscore two points: first, that “allegations matter,” specifically the particular allegations that each plaintiff brings, even in cases that are factually similar or brought under the same statutory provision; and second, that Article III must be satisfied regardless of the type of violation that is alleged (procedural or substantive).

The Thornley complaint alleged a violation of BIPA § 15(c), which specifies that companies may not profit from individuals’ biometric information, and conceded that no class member suffered any injury aside from that statutory violation (the inclusion of their biometric information in Clearview’s database).  In evaluating the removal to federal court, the district court found that these particular allegations did not show concrete or particularized harm.  The Seventh Circuit agreed, noting that its decision would be different if the allegations had been different – for example, if the named plaintiff had alleged that, by selling her data, defendant had amplified an invasion of privacy, or deprived her of her own opportunity to profit from her data, that could have potentially satisfied Article III standing.  As the complaint was written, though, it did not allege any particularized injury.

The Thornley court also considered whether the fact that Plaintiffs had brought a class action – in which class members might potentially have suffered more particularized harm – would defeat the remand to state court.  It distinguished Thornley from Standard Fire Insurance Co. v. Knowles, 568 U.S. 588 (2013), a case in which a plaintiff sought remand to state court with a stipulation that he and the class would not seek more than $5,000,000 in damages; the Supreme Court found that such a stipulation could not be binding on class members.  Thornley differed because the issue turned on Plaintiffs’ class definition, and because “the plaintiff controls her own case,” Plaintiffs had the right to file a complaint with a narrower class definition than they could have otherwise sought.  The Seventh Circuit noted that other putative plaintiffs were free to bring their own class actions against Clearview with broader class definitions.

Ultimately, the Seventh Circuit acknowledged that it was “no secret” that Plaintiffs had taken care with their allegations and proposed class definition to “steer clear of federal court,” but noted that, “in general, plaintiffs may do this.”  Plaintiffs were permitted to “take advantage” of the fact that BIPA permits suit for solely statutory violations.

In a concurrence, Judge David Hamilton opined on the different outcomes of Seventh Circuit decisions on standing for private plaintiffs bringing suit under consumer protection statutes, observing, “I confess that I have not yet been able to extract from these different lines of cases a consistently predictable rule or standard.”  Judge Hamilton noted that Spokeo may have “raised more questions than it answered” in its finding that standing requires concrete injury, but that intangible injuries may sometimes be considered concrete.  As a result, he observed that some Seventh Circuit decisions may have “take[n] Spokeo too far” in denying standing for private plaintiffs alleging substantive violations of consumer protection statutes.

Thornley is the latest in a slate of federal court decisions on Article III standing in the context of data breach litigation, but is a win for class action plaintiffs looking to litigate BIPA claims in Illinois state courts.  Defendants seeking removal to federal court in data privacy actions should take special notice of the specific allegations they are contesting.  As Thornley demonstrates, those allegations can make or break your removal case.

[1] As many CPW readers know, BIPA regulates the storage and sale of biometric data, and offers consumers protections of that data, including the right to sue a business that fails to comply with BIPA’s requirements.  CPW readers will also be familiar with Clearview AI, a manufacturer of software that “scrapes” pictures from social media sites and sells access to a database of these pictures to clients, many of whom are law enforcement agencies.

Here’s a common scenario:  You discover a potential compliance issue and worry about being sued.  You hire outside counsel to help prepare for litigation.  Trial counsel in turn hires a consulting firm for the express purpose of helping in its litigation efforts by preparing a report addressing how the breach happened, its effects, and how to prevent another breach.  Nothing too unusual, right?

Here’s the catch:  if “the Report, or a substantially similar document, would have been created in the ordinary course of business irrespective of litigation” it may not be privileged after all.  Applying this rule, a federal court in Washington, D.C. just held that a Report prepared for trial counsel, as well as the Report’s associated materials, are not privileged and must be produced to plaintiffs.  See Wengui v. Clark Hill, 2021 U.S. Dist. LEXIS 5395 (D.D.C. Jan. 12, 2021).  While Wengui involves a cyber breach, its reasoning applies to any compliance-related investigation.

Read on below.

After Clark Hill discovered a cyber breach, it contacted its cybersecurity vendor to investigate and remediate the attack to preserve business continuity.  Clark Hill separately retained counsel in anticipation of litigation and counsel, in turn, engaged a separate team from a different consultant “to inform counsel about the breach so that [counsel] could provide legal advice and prepare to defend [Clark Hill] in litigation.”  Id. at *9.  As expected, litigation ensued.  During discovery, Clark Hill produced the documents related to its cybersecurity vendor’s work, but claimed the Report prepared for counsel was classic attorney work-product.  Clark Hill also argued the Report was subject to the attorney-client privilege.

The district court disagreed.  Carefully examining the record, and after conducting an in camera review of the Report, the court determined the Report was in fact an “ordinary course” incident report and ordered its production to plaintiffs.  As the court explained, for many entities, “discovering how [a cyber] breach occurred [is] a necessary business function regardless of litigation or regulatory inquiries.”  Id. at *6 (emphasis added).

In asserting that work-product privilege extended to the Report, Clark Hill argued the Report was shielded from disclosure because it was the result of one part of a “two-tracked investigation” of the cyberattack.  As Clark Hill explained, in the wake of the breach it:

(1) Retained its “usual cybersecurity vendor” to “investigate and remediate the attack” for purposes of business continuity; and

(2) On an entirely separate track, had its outside litigation counsel retain a security consulting firm “for the sole purpose of assisting [the firm] in gathering information necessary to render timely legal advice.”

Id. at *8 (emphasis added).  Clark Hill argued this was congruent with the approach followed in the well-publicized Target data breach litigation, whereby the “latter investigation and report would apparently not have existed but for the prospect of litigation, even as the other report would have been prepared ‘in the ordinary course of business.’”

The court concluded that this so-called two-track story “finds little support in the record.”  These facts were persuasive to the court in reaching this determination:

  • Clark Hill offered no “sworn statement” that the firm’s cybersecurity vendor conducted a separate “investigation” for the purpose of ascertaining the root cause of the data breach or responding thereto;
  • Clark Hill’s interrogatory answers stated that its understanding of “the progression” of the cyberattack was “based solely on the advice of outside counsel and consultants retained by outside counsel”;
  • There was no evidence that the firm’s cybersecurity vendor ever produced any findings (let alone a comprehensive report) regarding the data breach; and
  • Emails suggested that two days after the cyberattack began Clark Hill turned to the security consulting firm “instead of, rather than separate from or in addition to” the regular cybersecurity vendor to do the necessary investigative work.

Id. at *8-12 (emphasis in original).

It did not help Clark Hill’s argument that the Report was not just shared with outside and in-house counsel, but also with Clark Hill’s leadership and IT teams, as well as the FBI.  As the court observed, “[t]he Report was probably shared this widely … because it ‘was the one place where [Clark Hill] recorded the facts’ of what had transpired.”  Id. at *12.

All compliance officers and outside counsel should heed this observation from the court:  “Although Clark Hill papered the arrangement using its attorneys, that approach ‘appears to [have been] designed to help shield material from disclosure’ and is not sufficient in itself to provide work-product protection.”  Id. at *13 (emphasis added).

The court also rejected Clark Hill’s assertion that the attorney-client privilege shielded the Report regarding the data breach from disclosure.  The court explained that the attorney-client privilege must be “applied narrowly,” to prevent its scope from sweeping in “all manner of services” that should not be shielded from discovery.

Finally, the court also ordered Clark Hill to respond to written discovery concerning the scope of the cyberattack and its impact (if any) on firm clients other than plaintiff.  According to the court, this information was relevant as it pertained to a central issue in the case—the adequacy of Clark Hill’s cybersecurity.  For example, the court noted, if the attack largely targeted plaintiff’s personal information, it might suggest that Clark Hill should have taken additional “special precautions” with regard to plaintiff’s data.  Moreover, the court also found that the identities of Clark Hill’s other clients, and “the fact of representation itself,” were not protected by the attorney-client privilege, because Clark Hill had not shown that in any particular instance a client’s identity was intertwined with the client’s confidences.

The Wengui decision underscores that while reports prepared for and at the request of counsel in anticipation of litigation can of course be privileged, compliance officers and counsel must be scrupulous to avoid blurring the lines between “ordinary course” reports and reports genuinely prepared for trial counsel for the purposes of assisting counsel in litigation.

For more developments concerning data privacy litigation as they occur in real time, stay tuned.  CPW will be there.

As a litigator, there’s nothing more important than pleading your case – and a recent case from the Eastern District of Pennsylvania reminds us that in litigation, more often than not, there are no chances for do-overs.

In Kelly v. Realpage, Inc., No. 2:19-cv-01706-JDW, 2021 U.S. Dist. LEXIS 842 (E.D. Pa. Jan. 5, 2021), plaintiffs brought a putative class action alleging that Realpage had violated the Fair Credit Reporting Act (FCRA) in collecting and disclosing information about eviction proceedings.  The court, however, denied plaintiffs’ motion for class certification.  In its Memorandum Opinion and Order, the court found that plaintiffs’ proposed classes were not ascertainable, as there was no easy way to tell which potential class members’ files contained a public record and could fall within the class, and that individual inquiries predominated over common issues.  Two weeks after the court issued the opinion, Plaintiffs moved for reconsideration based on three purportedly “new” pieces of evidence: excerpts of two depositions, taken in February 2020 and November 2020, and a declaration from December 2020.

With respect to the February and November depositions, the court found that the evidence was not new.  The February deposition took place months before the parties briefed Plaintiffs’ class certification motion, and Plaintiffs had even included another excerpt of that deposition in their class certification motion.  The November deposition was also taken before the court ruled on the motion, and the court explained that Plaintiffs had ample opportunities to put that evidence before the court, including by filing a supplemental brief.  Because Plaintiffs had both depositions available before the court ruled–and simply chose not to put them before it–the court found that they could not be considered “new” for purposes of a reconsideration motion.

The court also considered the December declaration, which Plaintiffs had obtained after the court had already ruled on class certification – but Plaintiffs did not explain why they had waited until after the ruling to obtain the declaration.  The court noted that the issue that was the topic of the declaration – whether a manual search would be needed to determine class members – had been disputed in the briefing on the class certification motion, and Plaintiffs could have obtained the declaration then.  Instead, as the court put it: “they got [the] declaration so that they could respond to the Court’s ruling. But that’s not how litigation works.”

Because the Kelly court found that the evidence was not “new”, it also denied Plaintiffs leave to file a new class certification motion, finding that there was not “good cause” to amend the parties’ scheduling order.  The court noted that scheduling orders are intended to keep cases moving forward, which could not happen “if parties could file renewed motions any time they thought of a way to address a court’s decision.”

Kelly also demonstrates the critical importance of following the local rules, which can make or break a case.  Plaintiffs had argued that class members could be identified administratively with the aid of computers–but did so in a footnote.  The court found that Plaintiffs had not satisfied their burden in arguing the point and noted that the judge’s policies and procedures explained that the court would not consider substantive arguments raised in footnotes–so even if the argument had been properly supported, the court would not have considered it.

As Kelly demonstrates, the applicable rules of civil procedure can have just as much of an impact on the outcome of a case as the merits.  Your time before the court is valuable – and limited.  We’re prepared to help you make the most of it at SPB.  In the meantime, we’ll keep an eye on cases like Kelly for you.

According to many plaintiffs in recently filed data breach litigations, credit and debit card fraud is a growing problem.  If this sounds familiar to readers of CPW, it should:  last year, we discussed a class action lawsuit filed by a group of credit unions against a Pennsylvania-based convenience store chain alleging a data breach disclosed sensitive consumer information.  That case was In Re: Wawa Inc. Data Security Litigation, No. 2:19-cv-06019 (E.D. Pa.).  While an opinion on Wawa’s motion to dismiss remains pending, a sister Pennsylvania court recently issued an opinion that may offer a preview of how some courts recognize a duty upon acceptance of a consumer’s electronic payment information.  In In re Rutter’s Data Sec. Breach Litig., 2021 U.S. Dist. LEXIS 761 (M.D. Pa. Jan. 6, 2021), the court addressed a motion to dismiss in the context of litigation regarding an alleged data breach at another Pennsylvania-based convenience store chain.  Rutter’s presents a number of takeaways for emerging case law, especially for its interpretation of Pennsylvania tort law, which was a key issue in Wawa.

Let’s take a look at the underlying factual allegations at issue.  Rutter’s, in contrast to Wawa, was not a lawsuit filed by credit unions.  Instead, the plaintiffs were four consumers who alleged they used their debit or credit cards to purchase items from the defendant around or during the time of an alleged data breach.  The plaintiffs filed a class action lawsuit against Rutter’s, a central Pennsylvania convenience store chain.  According to the complaint, in early 2020, the defendant disclosed a possible data breach that concerned payment cards used at the various point-of-sale devices installed throughout some of its locations.

The four plaintiffs can be separated into two different groups, each of which is discussed below.  The first group alleged they experienced fraudulent charges and unauthorized withdrawals from their accounts because of the alleged data breach.  They also alleged expenses incurred from securing their accounts against further fraudulent activities.  The second group, however, did not allege unauthorized access to their accounts.  Instead, they alleged merely a “continuing interest” in protecting their accounts from fraud and argued this “interest” was heightened or otherwise more legitimate than a passing concern because the first group of plaintiffs alleged fraudulent activity.

On January 6, 2021, the court issued its decision denying in part and granting in part defendant’s motion to dismiss.

First, the court rejected the second group of plaintiffs’ “continuing interest” argument and found those plaintiffs did not have standing.  The court examined relevant case law and concluded, “[t]he Third Circuit was unequivocal—where a plaintiff suffers no actual injury in a data breach, that plaintiff cannot rely on the mere possibility of future injury to establish standing.”  Additionally, the court rejected the second group’s argument that the injuries alleged by the first group should be transmitted to the second group for standing purposes.  Specifically, the court stated, “Plaintiffs’ argument would require us to grant standing to a plaintiff who is entirely without an injury based solely on the injuries allegedly suffered by a separate plaintiff.  That we cannot do.”

Second, the court upheld claims for negligence, implied breach of contract, and unjust enrichment.

For the negligence claim, the court found that the defendant assumed a legal duty when it retained the credit and debit card information of consumers who used those payment methods, because doing so created a risk of foreseeable harm from unscrupulous third parties.  This ruling expands the contexts in which a duty might arise beyond the one previously recognized in recent Pennsylvania state court decisions–an employer’s duty to its employees to “use reasonable means” to protect sensitive information that employees are required to disclose as a condition of employment.

As we discussed in our 2020 Year in Review, in response to the absence of uniform causes of action in data breach litigation, one strategy frequently utilized by plaintiffs has been to allege negligence claims.

The Rutter’s court stated, “a more general principle … has significant applicability here—that in new factual scenarios, a court need not undertake the burdensome task of carving out new legal duties, but, instead, courts can and should apply longstanding duties where possible.”  The court stated, “in other words, affirmative conduct associated with an increased risk of harm can yield a special relationship for tort purposes.”  The court then stated this defendant’s “affirmative act of retaining credit and debit card information” was sufficient to recognize a legal duty when it created a “risk of foreseeable harm from unscrupulous third parties.”

While it upheld the breach of implied contract claim, the court clarified the claim “may not ultimately succeed”.  According to the opinion, the plaintiffs alleged they entered into an implied contract with the defendant when they provided their payment card information in exchange for the defendant’s goods and services.  Through those transactions, the defendant allegedly “impliedly promised to safeguard their card information,” as evidenced in part by the representations in the defendant’s privacy policy.

The court agreed with plaintiffs, going so far as to state, “the context in which a consumer entrusts data to a merchant may be more suggestive of a promise to secure that data than in an employer-employee relationship.”  The court based its reasoning on the limited nature of the merchant and consumer relationship, clarifying:

The merchant and consumer are engaged in a momentary transaction that features all sorts of unspoken assurances between the parties—that the goods sold are as advertised and that the tender paid is valid, for example . . . When a customer provides financial information to a merchant, however, the customer could fairly assume that the data is for a single, limited purpose and thus the information will not be unreasonably exposed to third-parties; in other words, that the data will be used to complete a transaction and nothing more.

Finally, the court upheld the unjust enrichment claim on the theory that the plaintiffs conferred a material benefit to the defendant by paying funds for merchandise, a part of which was supposed to be used to employ adequate data privacy infrastructure.

Rutter’s puts another piece into the puzzle of standing in data breach litigation and offers a look into how a duty may be created upon acceptance of a consumer’s electronic payments.  As we await a decision in its peer case, Wawa, Rutter’s offers a potential look at developing case law trends.

CPW has previously covered how companies can proactively use binding arbitration agreements to manage litigation risk–including in the context of data privacy litigation.  But as a biometric software developer just learned, if you’re not a signatory to the agreement, you had better make sure the arbitration clause is drafted broadly enough to cover you if you want to avoid litigating Illinois Biometric Information Privacy Act (“BIPA”) claims in court.  Last week in Sosa v. Onfido, Inc., 2021 U.S. Dist. LEXIS 658 (N.D. Ill.), a judge in the Northern District of Illinois denied a motion to compel arbitration in litigation brought under BIPA, finding that the arbitration agreement did not cover the defendant.  Read on below.

The plaintiff in Sosa had an account with OfferUp, Inc., a marketplace where people buy and sell goods online.  According to the pleadings, OfferUp partnered with the defendant, Onfido, to establish users’ identities.  Specifically, the plaintiff alleged that users (including himself) upload their driver’s licenses or IDs along with photos of their faces, and that Onfido’s software scans the images and extracts biometric identifiers in order to confirm whether they match the uploaded IDs.  The plaintiff filed a putative class action complaint, alleging that Onfido violated BIPA by collecting and storing biometric information without obtaining written releases and providing certain required notices.

Onfido invoked the arbitration provision in OfferUp’s Terms of Service, which Onfido claimed the plaintiff agreed to when he registered for OfferUp and each time he accessed his account.  Ordinarily, as a matter of Illinois law, only signatories to an arbitration agreement can enforce it, but Onfido argued that three court-recognized exceptions to this rule applied: (1) third-party beneficiary, (2) equitable estoppel, and (3) agency.

The court disagreed.  The court first found that Onfido was not an intended beneficiary of OfferUp’s Terms of Service because the language in the arbitration provision did not extend to anyone but OfferUp and its users. Notably, this may have come out differently if the arbitration clause had explicitly covered disputes between users and Onfido, or even perhaps disputes between users and unspecified third-party providers, but the agreement here just referenced OfferUp.

The court also rejected the argument that equitable considerations required arbitration because, according to the court, under state-law principles, there was no indication that Onfido detrimentally relied on any representations about arbitration.  In this discussion, while the court recognized the liberal policy favoring arbitration agreements, it noted that this policy “has its limits,” and “[n]othing authorizes a court to compel arbitration of any issues, or by any parties, that are not already covered in the agreement.”

Finally, although recognizing that in Illinois, “an agent may invoke an arbitration agreement entered into by its principal,” the court held that Onfido was not an agent of OfferUp.  The court noted that “[c]ompanies routinely partner with one another to provide services to customers without acting as one another’s agents,” and there was nothing in the record to demonstrate a principal-agent relationship between OfferUp and Onfido.

So there you have it.  Another day, another data privacy litigation.  Here, the plaintiff’s BIPA claims against this ID-verifying software developer will be resolved in federal court, not arbitration.  Stay tuned.

In case you missed it, below is a summary of recent posts from CPW.  Please feel free to reach out if you are interested in additional information on any of the developments covered.

Ninth Circuit Sides with CFPB in Ratification Fight | Consumer Privacy World

Smart Homes and Liabilities: A Brave New World | Consumer Privacy World

Government Users of Facial Recognition Software Sued by Plaintiff Alleging Wrongful Imprisonment Over Case of Mistaken Identity | Consumer Privacy World

As 2020 drew to a close, the Ninth Circuit gave the CFPB a victory in Consumer Fin. Prot. Bureau v. Seila Law LLC, 2020 U.S. App. LEXIS 40572 (9th Cir. Dec. 29, 2020), upholding the CFPB’s civil investigative demand (CID) to Seila Law.  The case was on remand from the United States Supreme Court, which held that the statute establishing the CFPB violated the Constitution by placing leadership of the agency in the hands of a single Director who could only be removed for cause.  Seila Law LLC v. CFPB, 140 S. Ct. 2183 (2020).  The Supreme Court, however, concluded that the for-cause provision of the statute could be severed and did not require the invalidation of the entire agency; it then remanded the case back to the Ninth Circuit to determine whether the CFPB’s ratification of its earlier decision to issue the CID to Seila Law was valid.  Just over a month after hearing oral argument on the ratification question, a unanimous panel of the Ninth Circuit held that on July 9, 2020, the CFPB’s current Director, Kathleen Kraninger, validly ratified the agency’s earlier decision to issue a CID to Seila Law.

The Ninth Circuit quickly disposed of the two primary arguments put forth by Seila Law to challenge Director Kraninger’s ratification of the Seila Law CID.  First, relying on Federal Election Commission v. NRA Political Victory Fund, 513 U.S. 88 (1994), Seila Law argued that because the agency lacked the authority to issue the CID back in 2017, Director Kraninger’s 2020 ratification of such action was not valid.  In other words, according to Seila Law, an action that was void at the time taken could not be later ratified.  Finding the argument “largely foreclosed” by its earlier decision in Consumer Fin. Prot. Bureau v. Gordon, 819 F.3d 1179 (9th Cir. 2016), the Ninth Circuit concluded that “the constitutional infirmity relates to the Director alone, not the legality of the agency itself” and that the defect with the provision relating to the removal of the Director did not “render[] all of the agency’s prior actions void.”  As the Ninth Circuit noted, if that were the case, then there would have been no reason for the Supreme Court to remand the ratification question back to the Ninth Circuit.

The Ninth Circuit concluded that Seila Law’s second argument—that the ratification took place outside of the limitations period for bringing an enforcement action—was premature.  The statutory limitations period relied upon by Seila Law applies only to the bringing of an enforcement action, which has not happened here.  “The only actions ratified by Director Kraninger are the issuance and enforcement of the CID” against Seila Law.  Whether Seila Law could successfully bring a statute-of-limitations defense to any future enforcement action has no bearing on the validity of the Director’s ratification of the CID to Seila Law.

The Ninth Circuit’s decision confirms our earlier blog (LINK HERE) that defendants seeking to challenge Bureau actions taken before the Supreme Court invalidated the statute’s removal provision have an uphill battle.  The ratification issue is teed up in other cases around the country, so stay tuned to see whether any court sees the ratification issue differently than the Ninth Circuit.

The technology that science fiction promised us has finally arrived, but accompanying it are new duties, liabilities, and causes of action.  Smart homes, or homes interfaced with internet functionality, are growing in popularity.  In a smart home, features like door locks and appliances may be connected to the internet, allowing consumers to remotely control them or perform tasks from wherever they can maintain an internet connection.  Many consumers have embraced the promised convenience of smart homes, and experts predict a coming business surge.  For instance, in the U.S., some experts predict that the industry’s revenue could reach $141 billion by 2023.  Yet although we may find ourselves on the cusp of new lifestyles and conveniences, consumers and industry would be well-advised to look at a recent opinion that previews the liability issues that will inevitably emerge.

In Doty v. ADT, LLC, No. 20-cv-60972, 2020 U.S. Dist. LEXIS 245373 (S.D. Fla. Dec. 30, 2020), the court granted in part and denied in part a motion to dismiss a consumer’s class action lawsuit arising from misconduct involving her smart home system.  The plaintiff had her home outfitted with smart home technology, including cameras inside and outside of her home and locks that could be controlled through an internet connection.  The trouble began when the technician who installed the system gave himself remote access.  According to the opinion, the employee accessed the plaintiff’s account over 70 times.  He allegedly viewed and downloaded footage from the security cameras inside and outside of her home.

The plaintiff filed a class action lawsuit on behalf of herself and all customers “whose security systems were remotely accessed by an employee or agent” of the defendant “without authorization from the customer”.  The plaintiff alleged several state law causes of action—including breach of contract, negligence, violations of the Texas Deceptive Trade Practices Act, intrusion upon seclusion, intentional infliction of emotional distress, privacy monitoring, and negligent hiring, supervision, and retention—and one federal claim arising under the Computer Fraud and Abuse Act at 18 U.S.C. Section 1030.

Doty has a number of takeaways, but three stood out to us.

First, the court’s reasoning behind its decision to not dismiss the breach of contract claim suggests an implied duty to protect consumers from invasions of their privacy.  Although the plaintiff’s contract contained an express waiver of implied covenants, the court found an implied covenant “to supply a security system reasonably secure from unauthorized access.”  The court agreed with the plaintiff’s argument that “the contract necessarily implie[d] an agreement that the security monitoring services would be secure from intrusion” by the defendant’s employees, opining that, “A contract for a security monitoring service that is itself unsecure is a contract for nothing at all.”  (Internal quotations omitted).

Second, in upholding most of the negligence claims, the court recognized a duty to protect consumers from unauthorized intrusions of their privacy and found that physical injury was not required for a damages award.  Specifically, the court stated the defendant had a duty to “reasonably protect Plaintiff from invasions of privacy through unauthorized access of that system and that Plaintiff may recover damages for mental anguish caused by a breach of that duty, even in the absence of physical damages.”  The court did not discuss what actions would satisfy this duty or the defendant’s business practices, which leaves an area that may be explored on summary judgment.

Third, the court recognized that there was no cause of action for privacy monitoring, but did not completely close the door on injunctive relief.  The plaintiff requested injunctive relief, which included requiring the defendant to “create a fund sufficient to cover the costs of commercial and/or legal services needed to remedy the invasion of privacy that they have suffered”.  The court granted the defendant’s motion to dismiss this claim, finding it was not a viable cause of action, but recognized that “injunctive relief may be an available equitable remedy in the event” the defendant “is held liable” on other claims.

If the experts are correct, smart home technology will only continue to proliferate.  Doty, however, suggests technology will not be the only force to grow.  Liabilities, duties, and causes of action will likely continue to grow as the industry develops.  Doty is a case that we will be watching.

The world of digital marketing has grown exponentially in the last two decades.  In fact, it was estimated that in 2020, despite the global pandemic, approximately $332.84 billion would be spent on digital advertising worldwide.[1]  Not surprisingly, sophisticated algorithms (such as real-time bidding and programmatic ad buying) have been built in recent years to master the science of digital marketing and customer segmentation (aka target marketing).  While none of the current U.S. privacy laws explicitly prohibit target marketing based on electronically obtained consumer data, this space is becoming crowded and increasingly regulated, and the landscape is changing.  And so we ask the obvious question: can target marketing withstand the emerging privacy regulations?  Our answer is probably, with certain notable caveats.

Target marketing is an old but powerful marketing strategy.[2]  It used to involve breaking consumers into defined segments where each segment shared some similar characteristic, such as gender, age, buying power, demographics, income, or a combination of a few shared characteristics, and then designing marketing campaigns based on the shared characteristic(s).  Approaches have changed with the passing of time.  Nowadays, target marketing has been narrowed to the point of defining every individual consumer or household, and designing marketing campaigns for each individual consumer or household.  Target marketing is often the key marketing tool used to attract new business, increase sales, or strengthen brand loyalty.[3]  Despite its success, with the massive amount of consumer data now being used to target consumers, and the emerging data privacy laws and regulations, marketers have to tread carefully to avoid getting themselves in (legal) hot water.

How do marketers access consumer data?  And why is it potentially problematic?

Let’s first address consumer data.  Marketers can acquire data themselves (aka “first party data”).  This includes data from behaviors, actions, or interests demonstrated across website(s) or app(s), as well as data stored in a business’ customer relationship management system (“CRM”).[4]  By contrast, “second party data” or “third party data” is data acquired from another source.  It could be someone else’s first party data, or it could be data collected by outside sources that are not the original collectors of the data.[5]

The most common method for obtaining consumer data (first, second or third party) over the internet has been through cookies stored on our digital devices.[6]  (For a recent litigation involving the use of cookies in the context of kids’ privacy rights see this prior post).  Cookies are used to track the activities of devices as users visit particular web pages, allowing advertisers to build profiles of a device’s online activities; these profiles can then be used to create targeted advertising tailored to the user of that device.[7]

Marketers are also able to obtain data through social media platforms.  Most of us using social media are aware of the personal information we submit before we create our accounts.  This information may include some personally “identifiable” information, such as our name, address, and date of birth, but there is other personal information which is not considered “identifiable”, such as our gender, age, postal code, etc.  Marketers can then partner with social media platforms to create marketing campaigns based on consumer segments created through each individual’s personal information.  Ever wonder why your husband is not seeing ads for women’s shoes, or why you are receiving ads for products or services you have not shopped for but may be interested in?  It is target marketing.  (And of course, as CPW has covered, data can also be harvested from social media platforms through scraping).

So what?  Well, until recently (with a few notable exceptions such as the Fair Credit Reporting Act (“FCRA”)) laws regulating companies selling or acquiring consumer data were sparse and preceded the advent of new technologies.  Compare Trans Union LLC v. FTC, 536 U.S. 915, 917 (2002) (stating that “the FCRA permits prescreening—the disclosure of consumer reports for target marketing for credit and insurance. . . .”) with FTC I, 81 F.3d 228 (D.C. Cir. 1996) (holding that selling consumer reports for target marketing violates the FCRA).

In many respects, corporations were thus able to use consumer data to create complex marketing campaigns.  This practice recently came up in the context of the Capital One data breach.  See, e.g., In re Capital One Consumer Data Sec. Breach Litig., 2020 U.S. Dist. LEXIS 175304, at *28 (E.D. Va. Sep. 18, 2020) (discussing plaintiffs’ allegation that “Capital One created a massive concentration of [personally identifiable information], a ‘data lake,’ in which Capital One ‘mines [customers’] data for purposes of product development, targeted solicitation for new products, and target marketing of new partners—all in an effort to boost its profits.”).

The tide is starting to change.  With the emergence of more recent data privacy laws, such as the California Privacy Rights Act of 2020 (“CPRA”), the California Consumer Privacy Act of 2018 (“CCPA”), and the General Data Protection Regulation (“GDPR”), “covered entities” can no longer use personal information carte blanche for advertising purposes.  However, it bears noting that the statutory definition of personal information remains much narrower than what one might assume.  The CCPA, for example, defines personal information as: “…information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household…”  California Consumer Privacy Act of 2018 §1798.140(o)(1).

Thus, information about one’s gender and income, without more, would not fall under this definition.  Are consumers comfortable having this information used without their consent?  Do they even have a choice?  It depends.  Although common law tort principles, such as invasion of privacy, embarrassment or emotional distress, may allow some legal remedies, case law is sparse and, for obvious reasons, has trended towards permitting corporate use of such data.  See, e.g., Bradley v. T-Mobile US, Inc., 2020 U.S. Dist. LEXIS 44102 (N.D. Cal. Mar. 13, 2020) (rejecting claim that use of consumer data, including age, for target marketing concerning online job postings constituted age discrimination and violated various federal and state laws).

At least insofar as California is concerned, there have been some interesting developments concerning target marketing of late.  This is because under the CCPA, some businesses engaged in target marketing interpreted “sales” as excluding the exchange of personal information, such as cookie data, for targeting and serving advertising to users across different platforms.  This approach was on the purported basis that no “sales” (as defined in the statute) were involved because no exchange for “valuable consideration” had occurred.  The CPRA, which was approved by California voters in November, utilizes the concept of “sharing” and seemingly eliminates this potential loophole (although that doesn’t mean there won’t be future litigation regarding this issue).

The concept of “data clean rooms” has also (re)surfaced to bypass the issues related to sharing customer data.  Data clean rooms allow companies, or divisions of a single company, to bring data together for joint analysis under defined guidelines and restrictions that keep the data secure.[8]  Whether a clean room contains PII or anonymized data, data privacy practices are critical.  If the anonymized data can be deanonymized (tied back to actual people through creative analytics), that would make the data subject to most privacy laws (and definitely the GDPR).

What does the future look like for digital advertising?  With the spike in U.S. state regulations relating to consumers’ online privacy, such as the CPRA, the Nevada Senate Bill 220 Online Privacy Law (2019), and the Maine Act to Protect the Privacy of Online Consumer Information (2019)[9], it remains fluid.  There have also been changes in cybersecurity, data security, and data breach notification laws (although we will table discussion of the specifics of that for another day).  The bottom line is that marketers now not only have to pay extra attention to each state’s regulation before obtaining and/or processing consumer information, they also have to pay extra attention to the consent obtained.  The free rein of using unlimited consumer data to create complex algorithms for the optimal marketing campaign is slowly coming to a halt.

To mitigate litigation risk, entities in the marketing industry will have to take a jurisdiction-specific approach that accounts for recent developments.  And as the scope of these new laws and regulations is tested via litigation, CPW will be there every step of the way.  Stay tuned.

[1] https://www.emarketer.com/content/global-digital-ad-spending-update-q2-2020

[2] https://www.acrwebsite.org/volumes/8572/volumes/v29/NA-29

[3] https://www.thebalancesmb.com/target-marketing-2948355

[4] https://www.lotame.com/1st-party-2nd-party-3rd-party-data-what-does-it-all-mean/#:~:text=First%20party%20data%20is%20the,you%20have%20in%20your%20CRM

[5] Ibid.

[6] Swire, Peter and Kennedy-Mayo, DeBrae, “U.S. Private-Sector Privacy,” Third Edition,  Pg 130

[7] Ibid.

[8] https://www.snowflake.com/blog/distributed-data-clean-rooms-powered-by-snowflake/

[9] https://www.csoonline.com/article/3429608/11-new-state-privacy-and-security-laws-explained-is-your-business-ready.html

It has become commonplace for government agencies and law enforcement, particularly in large metropolitan areas, to use facial recognition software.  These practices, though, have garnered recent public attention and some controversy. In response to concerns raised by media coverage of Clearview’s practices, three cities last year banned their governments from using facial recognition technology, and another banned all corporate uses of facial recognition technology in public spaces.  However, for the most part government utilization of facial recognition software has continued unabated.

But using such software is not without risks, as shown by a lawsuit recently filed against law enforcement officers and prosecutors.  In Parks v. McCormack, et al., Case No. L-003672-20 (N.J.), the plaintiff alleges he spent ten days wrongfully imprisoned after facial recognition software used by a New Jersey police department mistakenly identified him as a suspect in a criminal investigation.  This was allegedly despite the fact that the plaintiff’s fingerprints and DNA did not match those left at the scene of the crime, and that the plaintiff provided an alibi at the time of his detention.  The complaint alleges that the police department involved relied solely on facial recognition technology in issuing the warrant for the plaintiff’s arrest.  Plaintiff filed suit against the police, the prosecutor, and the municipality involved for false arrest, false imprisonment, and violation of his civil rights.  The lawsuit comes almost one year after New Jersey’s attorney general asked state prosecutors to stop using Clearview AI’s app and announced an ongoing investigation into it and similar facial recognition software.

The plaintiff is the third person reported to have been falsely arrested based on an incorrect facial recognition match.  Notably, in all three instances the individuals mistakenly identified by the software were Black men—underscoring racial bias concerns previously raised about the adoption of facial recognition technology by government bodies.  The Parks lawsuit names as defendants the officials and government entities involved in the plaintiff’s allegedly wrongful detention and imprisonment.  However, due to the doctrine of governmental immunity, which shields the government from liability for the actions of state or federal employees under certain circumstances, future litigations may also seek to bring direct claims against the manufacturers of such software – and one such manufacturer, Clearview, is certainly no stranger to privacy litigation.  Stay tuned.