With the first quarter of 2022 coming to a close, litigation involving the collection and protection of biometric data is off to a hot start, setting a fervent pace that could mean big things for data privacy litigation in 2022 (with crossover impact on data breach and cybersecurity litigation, as outlined below).  Read on to see what trends CPW has seen, and which topics we will be keeping our eyes on as the year continues.  For more information, be sure to register for our webinar on April 5 from 12-1 pm on “Developments and Trends Concerning Biometric Privacy and Artificial Intelligence.”

I.     New Biometric Privacy Cases Filed in Q1 2022

At the time of writing, more than one hundred and ten cases related to biometric data privacy have been filed.  It should come as no surprise to regular CPW readers to learn that nearly all of these cases were prospective class action claims filed in Illinois alleging damages under the Biometric Information Privacy Act (“BIPA”).  For those new to CPW, BIPA is a state statute that provides state citizens with a private cause of action if their biometric information has been collected or shared without their informed consent.

Some quick statistics about these BIPA cases:

  • The majority, more than sixty-five cases, involved claims arising from fingerprints captured by timekeeping machines used by the plaintiffs’ employers.
  • Twenty-five of these litigations involved allegations that the fingerprints were collected without the plaintiffs’ knowledge or consent, while nineteen complaints alleged that the employer failed to provide the plaintiffs with information relevant to the recording and retention of the information.  Additionally, thirteen litigations sought damages on allegations that the plaintiffs’ employer failed to safeguard the data from third parties and/or hackers.  Finally, eight plaintiffs simply alleged that the employer never obtained written consent as required under the statute.
  • Eleven litigations were filed concerning allegations that the defendant had obtained the plaintiffs’ facial geometry without knowledge or informed consent, or without safeguarding the information from third parties—a growing area of BIPA litigation, consistent with prior trends.
  • Moreover, ten cases concerned claims involving the collection of voice recognition data—another growing area of potential litigation risk.

II.   Biometric Privacy Cases to Watch in 2022

CPW has identified a number of biometric cases as ones to keep an eye on as the year progresses.  This includes:

Stein v. Clarifai, Inc., No. 22 CV 314 (D. Del.): After winning dismissal of a BIPA class action filed in Illinois on personal jurisdiction grounds (covered by SPB team member David Oberly for Bloomberg Law here), AI software developer Clarifai found itself hauled into court once again—this time in Delaware—for purportedly running afoul of Illinois’s biometric privacy statute.  In that case, Stein v. Clarifai, Inc., Clarifai—which specializes in machine learning to identify and analyze images and videos using facial recognition technology—allegedly harvested facial template data from OkCupid dating profile photos without providing notice or obtaining consent.  If this procedural posture seems familiar, that is because it parallels another recent BIPA class action involving a cloud-based call center entity and its integrated voiceprinting technology provider—which was also refiled in Delaware after being dismissed in Illinois for lack of personal jurisdiction.  The plaintiffs in the earlier voiceprint class action fared no better the second time around, with a Delaware federal court dismissing the re-filed suit based on a successful extraterritoriality challenge.  Only time will tell whether the Clarifai suit will avoid the same fate.

Roberts v. Cooler Screens Inc., No. 2022-CH-0184 (Ill. Cir. Ct. Cook Cnty.):  In another recently-filed case, Roberts v. Cooler Screens Inc., Cooler Screens’s “Smart Coolers” have been targeted for biometric data collection practices that purportedly violate BIPA.  “Smart Coolers” replace refrigerator cases in retail stores, swapping the doors for digital screens that provide an “interactive experience” to customers.  This experience, according to the complaint, includes a “facial profiling system” that “detect[s] the age, gender, and emotional response of over 3 million verified daily viewers.”  The facial recognition system analyzes each customer, determining which advertisements and suggestions are most likely to lead to a purchase.  While some might view this as an exciting advertising opportunity, the plaintiff saw otherwise, characterizing the technology as an unlawful collection of biometric data and a violation of BIPA’s notice and consent requirements.  This case will be worth watching, as the overlap between developing technology and efforts to ensure the privacy of biometric data is likely to generate further litigation in the near future.

Copple v. Arthur J Gallagher & Co., No. 22 CV 116 (W.D. Wash.): Outside of BIPA claims, some litigants have alleged harms arising from biometric data in other contexts.  In Copple v. Arthur J Gallagher & Co., a ransomware attack resulted in a prospective class action filed against the defendant, “one of the leading insurance brokerage, risk management, and HR & benefits consulting companies in the world.”  The plaintiffs in this action allege that a number of the defendant’s clients provided the defendant with the plaintiffs’ personally identifiable information (“PII”) and protected health information (“PHI”) without the plaintiffs’ knowledge or consent.  According to the complaint, the defendant was struck by a cyberattack beginning in June 2020 but did not discover the attack until September 26, 2020.  The company allegedly did not begin notifying plaintiffs of the breach, however, until more than nine months later, in July 2021.  Over the next six months, the company provided almost weekly reports to the state Attorney General reflecting an increasing number of affected individuals, beginning with only 1,825 Washington residents in its initial July 13, 2021 report and culminating in 72,835 affected by December 6, 2021.  Plaintiffs seek damages, claiming that the PII and PHI are likely to appear on the dark web and that the class members were harmed by the significant delay in notification.

III.     Notable 2022 Trends in Biometric Privacy Litigation

From a broader perspective, there are several areas of activity in BIPA class action litigation that are worth keeping an eye on as we head into the second quarter of 2022.

Voiceprints, Take II: One noteworthy trend that has developed since the start of the year is an increased volume of BIPA class action filings targeting voice biometric technologies.  Voice biometrics (also known as a “voiceprint”) relies on the analysis of unique voice patterns to identify or verify individuals’ identities.  In other words, this is the use of biological characteristics—one’s voice—to verify an individual’s identity.  Voiceprints can be distinguished from general voice data, which merely captures a person’s voice without analyzing the components of the voice and/or generating a voiceprint for the purpose of verification or identification.  While voiceprints fall under BIPA’s scope, courts have held that general voice data does not, with the important dividing line being whether the data is capable of identifying the individual.

In mid-2021, a wave of lawsuits was filed targeting voice-powered technologies—including a high-profile suit involving McDonald’s drive-thru voice assistants, which SPB team member Kristin Bryan covered extensively in CPW articles here, here, and here.  The majority of this litigation fell flat because the technology at issue ultimately did not involve voiceprints, but rather tech that merely captured or used individuals’ voice data.  It appears that enterprising plaintiffs’ attorneys have again turned their attention to voice data in 2022, with one main difference.  This time around, these BIPA class actions are focusing narrowly on voice data that is used specifically for time and attendance purposes.  Because timekeeping necessarily involves the verification of individuals’ identities, there is a reasonable likelihood that this round of filings may be different from 2021, when the majority of suits were dismissed shortly after they were filed.

Additional Uses of Facial Recognition: Similarly, there has also been a wave of new BIPA filings focused on targeting timekeeping systems that utilize facial recognition software.  While facial biometrics has long been one of the most popular targets for BIPA class actions, in the timekeeping context these actions have traditionally been confined to the use of fingerprint time and attendance systems.  That is no longer the case in 2022.

Facial Recognition Cameras Used for Vehicle Monitoring: Facial recognition-powered cameras used to monitor vehicle fleets and their drivers have also emerged as a new favorite target for BIPA class actions.  Transportation companies are increasingly relying on facial recognition cameras to analyze video collected from cameras mounted on the interior windshields of vehicles in their fleets to monitor driver activity and protect these companies against losses from vehicle accidents.  The AI technology that is used by these facial recognition cameras allows for the monitoring of external variables such as cars and road signs.  More importantly, this AI tech allows the devices to continually monitor and classify their drivers’ status, including whether they are being attentive at the wheel.

According to recently-filed suits, these cameras also collect drivers’ facial data and analyze it to detect certain types of driver behavior, like distracted or drowsy driving, then use a built-in cellular data link to upload the video, biometrics, and other data to the company’s servers.  While these suits allege that these cameras scan drivers’ facial geometry—which, if true, would bring these cameras within the scope of BIPA—it is uncertain whether this technology actually satisfies the definition of “facial recognition” under the law.  Importantly, this trend illustrates the complex compliance decisions that arise when attempting to mitigate BIPA liability exposure in connection with new and advanced technologies where courts have not clearly addressed whether they fall under the scope of Illinois’ biometric privacy law—and the need to consult with experienced biometric privacy counsel before rolling out any new type of biometric- or AI-related technology to ensure legal risks are addressed to the greatest extent possible.

IV.   Recent Significant Decisions in Biometric and AI Privacy Litigation

Several significant decisions concerning biometric and AI litigation have been handed down in Q1 2022.  Below, we highlight a few of these decisions as potential trends for 2022 litigation.

BIPA & Personal Jurisdiction:  As noted above, BIPA provides a cause of action for Illinois residents who believe their biometric information has been obtained or disclosed without consent.  One recent decision confirmed, though, that a suit brought by Illinois residents does not automatically establish personal jurisdiction, and that the residency of a plaintiff or putative class members may not, by itself, be a sufficient connection to Illinois.

In Gutierrez v. Wemagine.ai LLP, 2022 U.S. Dist. LEXIS 14831 (N.D. Ill. Jan. 26, 2022), Plaintiffs claimed that defendant’s app obtained and disseminated the biometric information of its users without their written consent in violation of BIPA, but defendant was a Canadian company whose only contacts with Illinois were app downloads in the state.  Defendant moved to dismiss for lack of personal jurisdiction, which the court granted, finding that defendant had not “targeted” the forum of Illinois (such as through marketing or sales).  While this is consistent with what many other courts have found with respect to personal jurisdiction, it sets an important precedent in the BIPA context that, without more, a plaintiff and/or putative class members are not a sufficient connection to Illinois – personal jurisdiction must still be proper and comport with due process.  It is also a reminder to entities sued under BIPA to thoroughly examine whether personal jurisdiction is proper and to vigorously litigate the issue if it is not.  This may explain the choice of venue in Stein v. Clarifai, Inc., and may suggest that more BIPA claims will be filed in out-of-state courts going forward.

BIPA Preemption: In a continuation of a 2021 trend, one often-raised defense to a BIPA suit is that the claims are preempted by a federal or Illinois state statute.  Three recent decisions illustrate both the strength and the limits of this complete liability defense in BIPA litigation.

Federal Litigation: Just recently, an Illinois federal court in Kislov v. Am. Airlines, Inc., No. 17 CV 9080, 2022 U.S. Dist. LEXIS 50481 (N.D. Ill. Mar. 22, 2022), dismissed a BIPA class action against airline giant American Airlines arising out of the company’s integration of interactive voice response (“IVR”) software into its customer service hotline.  IVR is the “robot voice” that a caller hears when calling a customer support hotline.  Of note, the software also collects, stores, and analyzes callers’ voiceprints to understand and predict callers’ requests and track interactions with callers over time.  According to the plaintiffs, American Airlines deployed this voiceprint technology without providing customers notice or obtaining their consent, in violation of BIPA.  The airline moved to dismiss the action, arguing that the suit was preempted by the Airline Deregulation Act (“ADA”).  The court agreed, finding that American Airlines’ use of the IVR software was covered under the ADA’s preemption provision because it concerned the services the airline provided to customers.

Kislov is by no means the first BIPA action to be dismissed based on a successful preemption challenge.  Federal courts have dismissed a number of BIPA class actions on preemption grounds under the Railway Labor Act (“RLA”) and § 301 of the Labor Management Relations Act (“LMRA”).  Significantly, however, both the RLA and the LMRA apply in the context of unionized employment relationships.  Kislov is noteworthy because it demonstrates that the preemption defense is not limited to employers in BIPA litigation, but can also be deployed in a much broader range of contexts, including to defeat biometric privacy class actions filed by customers or other consumers.

State: An Illinois appellate court recently confirmed that federal law may preempt BIPA in certain circumstances.  In Walton v. Roosevelt Univ., 2022 Ill. App. LEXIS 83 (Ill. Ct. App. Feb. 22, 2022), Plaintiff, who belonged to a union, filed suit seeking damages from his employer for alleged BIPA violations, including the collection, storage, use, and dissemination of his biometric data, as well as its disclosure to a third-party payroll service.  Defendant employer moved to dismiss on the grounds that Plaintiff’s claims were preempted because he was covered by a collective bargaining agreement, and the Court agreed.  It found that, while Plaintiff and other members of the putative class who were unionized employees were not prohibited from seeking redress, the collective bargaining agreement required them to seek it through the grievance procedures laid out in the agreement.  Accordingly, the Court expressly found that BIPA “claims asserted by bargaining unit employees covered by a collective bargaining agreement are preempted under federal law.”

On the other hand, a landmark ruling from the Illinois Supreme Court held that BIPA was not preempted by the Illinois Workers’ Compensation Act.  In McDonald v. Symphony Bronzeville Park, 2022 Ill. LEXIS 194 (Ill. Feb. 3, 2022), the Illinois Supreme Court found that the Illinois Workers’ Compensation Act does not preempt BIPA claims, and, as such, an employer may be subject to liability for damages claims under BIPA as well as liability under the workers’ compensation framework.  Employees of a nursing home had sued, alleging violations of their rights under BIPA based on the employer’s practice of scanning employees’ fingerprints as a means of timekeeping.  The employer argued that employees could not seek damages under BIPA because the violation had occurred at work, and so plaintiffs’ exclusive remedy was to seek compensation under the Workers’ Compensation Act.  The Illinois Supreme Court disagreed, finding that violations of BIPA are not a “compensable injury” under the state Workers’ Compensation Act because whether an injury is compensable depends both on where the injury occurred and the nature of the injury, and other compensable injuries under the statute differed greatly from the “personal and societal injuries” under BIPA.

With the first quarter just about wrapped up, 2022 is promising to be an exciting year in AI and biometric data litigation.  As the year continues, be sure to stay tuned; CPW will be your go-to source to stay on the forefront of all new developments in real time.

As seasoned data privacy and biometric litigators are already aware, the United States does not have a comprehensive federal law regulating the collection, processing, disclosure, and security of personal information (“PI”)—typically defined as information that identifies, or is reasonably capable of being linked to, an individual.  Rather, a patchwork of federal and state sectoral laws governs these issues.

A federal court recently dismissed biometric litigation brought against a marketer and seller of video technology products.  Jacobs v. Hanwha Techwin Am., Inc., 2021 U.S. Dist. LEXIS 139668 (N.D. Ill. July 27, 2021).  Although at least two other prior cases had allowed similar claims against a third-party technology provider to proceed into discovery, the court found Plaintiff’s allegations in this instance distinguishable.  Read on to learn more.

First, a recap.  The Illinois Biometric Information Privacy Act (“BIPA”) was enacted in 2008 and sets standards for the retention and handling of the biometric data of Illinois residents.  As readers of CPW know, BIPA protects the “biometric information” of Illinois residents, which is any information based on “biometric identifiers” that identifies a specific person—regardless of how it is captured, converted, stored, or shared.  740 ILCS 14/10.  Biometric identifiers are “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”  Id. (collectively, with “biometric information,” “biometric data”).

Now, a closer look at the allegations in Jacobs.

Plaintiff alleged that when shopping in December 2020, he saw several of Defendant’s security cameras installed at the entrance of a T.J. Maxx store in downtown Chicago.  Plaintiff also alleged that he purportedly learned about the cameras’ “ability to perform facial recognition” during that shopping trip (cue CPW eye roll).  Plaintiff asserted that Defendant collected his biometric data through facial recognition technology in the security cameras “to track, identify, and prosecute shoplifters.”

Plaintiff raised a litany of claims under BIPA for Defendant’s failure to provide notice that it is collecting and storing biometric data, and alleged “upon information and belief” that Defendant disclosed such data in violation of the statute.  Plaintiff sought certification of the following class: “[a]ll individuals in the State of Illinois who had their facial geometry scans, biometric identifiers, and/or biometric information collected, captured, received, or otherwise obtained, maintained, stored, disclosed, or disseminated by defendant during the applicable statutory period.”

Defendant in moving to dismiss focused on what was tellingly absent from the Complaint:

  • Plaintiff does not allege that Defendant installed the cameras, operated the cameras, or in any way accesses or controls T.J. Maxx’s security system.
  • Plaintiff also does not allege that Defendant operates any systems or servers to store any information captured by the cameras.
  • Instead, Plaintiff’s complaint suggests that Defendant’s only alleged connection to those cameras was its role as the manufacturer and distributor.

The court ultimately sided with defendant, finding Plaintiff’s claims were conclusory or otherwise failed to state a cognizable claim under BIPA.

First, looking at Plaintiff’s Section 15(b) BIPA claim, the court found that Plaintiff’s allegations merely parroted the language of the statute.  Recall that unlike Sections 15(a), (c), (d), and (e) of BIPA—all of which apply to entities “in possession of” biometric data—Section 15(b) applies to entities that “collect, capture, purchase, receive through trade, or otherwise obtain” biometric data.  740 ILCS 14/15(a)-(e). Additionally, “mere possession of biometric data is insufficient to trigger Section 15(b)’s requirements.”  However, Plaintiff argued (in reliance on the “otherwise obtain” language in Section 15(b)) that the provision applies to any private entity that obtains biometric data, no matter the source or manner of collection.

The court rejected this interpretation as “flawed.”  Following the rulings of other courts applying BIPA, it held that “for Section 15(b)’s requirements to apply, an entity must, at a minimum, take an active step to collect, capture, purchase, or otherwise obtain biometric data.” (emphasis supplied).  However, here Plaintiff failed to adequately allege that Defendant took any active steps to collect biometric data.  Instead, the allegations in the complaint made clear to the court that Defendant is “a third-party technology provider (that is, merely provided the cameras), and that the active collector and processor of the data is T.J. Maxx.” (emphasis supplied).

Second, the court rejected Plaintiff’s Section 15(a) and (d) claims as similarly fundamentally flawed.  This was because, the court explained, Sections 15(a) and 15(d) of BIPA apply to entities “in possession of” biometric data. 740 ILCS 14/15(a), (d).  Because BIPA does not define “possession,” courts have routinely used the ordinary definition of the word.  Accordingly, possession for purposes of BIPA occurs when someone “exercis[es] any form of control over the data or … held the data at [his] disposal.”  However, the court held, Plaintiff “does not provide any factual allegations that plausibly establish that defendant exercised control over plaintiff’s data or otherwise held plaintiff’s data at its disposal.” (emphasis supplied).

Plaintiff’s complaint was dismissed.  Notably, at least two other BIPA cases involving claims against a third-party technology provider made it past a motion to dismiss.  However, unlike Jacobs, the factual allegations in those prior cases made clear that the manufacturers of the fingerprint scanners had themselves collected, obtained, or stored the biometric data.  For more on this developing area of the law, stay tuned.  CPW will be there.

2021 was another record setting year for biometric litigation, with class action plaintiffs bringing new AI-based consumer privacy claims and a continuing trend of employment-based disputes.  Read on for CPW’s highlights of the year’s most significant events concerning biometric litigation, as well as our predictions for what 2022 may bring.

Overview of 2021 BIPA Litigations: What Do the Numbers Show?

One of the most critical consumer privacy statutes for biometric litigation has been Illinois’ Biometric Information Privacy Act (“BIPA”), which regulates the collection, processing, disclosure, and security of the biometric information of Illinois residents.

BIPA protects the “biometric information” of Illinois residents, which is any information based on “biometric identifiers” that identifies a specific person—regardless of how it is captured, converted, stored, or shared.  740 ILCS 14/10.  Biometric identifiers are “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”  Id. (collectively, with “biometric information,” “biometric data”).  BIPA has found itself to be one of the most frequent targets for class actions, as it includes a private right of action with liquidated statutory damages, unlike many other data privacy statutes.  Plaintiffs bringing suit under BIPA may seek actual damages or liquidated damages of either $1,000 per violation for negligent violations or $5,000 per violation for intentional or reckless violations.
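Because BIPA’s liquidated damages are assessed per violation, potential class-wide exposure scales multiplicatively with class size and the number of violations.  The back-of-the-envelope arithmetic can be sketched as follows (all figures hypothetical and purely illustrative, not drawn from any actual case):

```python
# Illustrative only: BIPA liquidated statutory damages arithmetic.
# BIPA (740 ILCS 14/20) allows recovery of $1,000 per negligent violation
# or $5,000 per intentional/reckless violation (or actual damages, if greater).

NEGLIGENT_DAMAGES = 1_000
RECKLESS_DAMAGES = 5_000

def statutory_exposure(class_size: int, violations_per_member: int,
                       reckless: bool = False) -> int:
    """Rough ceiling on liquidated damages for a hypothetical class."""
    per_violation = RECKLESS_DAMAGES if reckless else NEGLIGENT_DAMAGES
    return class_size * violations_per_member * per_violation

# Hypothetical 1,000-member class, one violation per member:
print(statutory_exposure(1_000, 1))        # 1000000 (negligent theory)
print(statutory_exposure(1_000, 1, True))  # 5000000 (intentional/reckless theory)
```

This mechanical scaling is why even modest per-violation amounts can drive eight- and nine-figure settlement values in BIPA class actions.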

The number of complaints filed under BIPA held steady in 2021, with heavy case volume cited as one of the reasons that comprehensive privacy legislation with a private right of action failed to be enacted by the Florida legislature.  In 2021, at least 89 court rulings referenced BIPA.  This is more than a four-fold increase from 2019.  While the overwhelming majority of these rulings came from federal courts within the Seventh Circuit, BIPA decisions were also issued by Illinois state courts and federal courts within the Third, Fourth and Ninth Circuits.

Settlement activity under BIPA was also consistent with these other litigation trends.  2021 saw multiple BIPA settlements.  Although the largest settlement ($650 million) was announced early in the year with a technology company, there were numerous others (with significant variation in settlement amounts).

To list just a few examples, in April a Cook County judge granted final approval to a $25 million settlement resolving a putative class action brought against technology company ADP concerning its provision of biometric scanning technology to employers for timekeeping purposes.  Later, in June, the parties to the seminal Six Flags litigation (where the Illinois Supreme Court held a plaintiff could recover even for technical violations of BIPA in the absence of actual harm) received preliminary approval for a proposed class action settlement with an anticipated value of $36 million.  This fall, Compass Group USA Inc. and a retail technology company agreed to pay $6.8 million as part of a settlement to resolve claims alleging they collected fingerprint data from vending machine users without the proper notice and consent required under BIPA.  That was not the only BIPA settlement at the end of the year, as in October a federal court in Illinois granted preliminary approval to a $92 million settlement reached in the TikTok multidistrict litigation, over objections that had been raised in March concerning the basis and terms of the settlement.

Article III Standing Continues to be a Strategic Pressure Point

As shown by the large number of BIPA cases decided by federal courts in the Seventh Circuit, defendants have shown a preference for removing BIPA litigations to federal court.  In response, plaintiffs this year sought in several cases to strategically limit their claims so as to fall short of Article III standing requirements and thereby preclude removal.  The foundation for this strategy was laid in 2020 and early 2021 with several rulings from the Seventh Circuit.

In Bryant v. Compass Group USA, Inc., the Seventh Circuit addressed standing to sue for two BIPA claims: (1) a violation of Section 15(b), the Act’s informed-consent provision; and (2) a violation of one part of Section 15(a)—namely, the duty to publicly disclose a data-retention policy.  The Court held that the plaintiff had standing to pursue the Section 15(b) claim.  However, the Court’s view of the Section 15(a) claim was different, as the plaintiff in Bryant had not alleged any concrete and particularized harm from the defendant’s failure to publicly disclose a data-retention policy.  As such, the Seventh Circuit held that the Bryant plaintiff lacked standing on that claim.  The Court cautioned, however, that its latter holding was confined to the narrow violation the plaintiff alleged (the Court did not address standing requirements for claims under other parts of Section 15(a)).

In Fox v. Dakkota Integrated Sys., 980 F.3d 1146 (7th Cir. 2020), the Court addressed this issue head on.  The Fox Plaintiff made several claims under BIPA, including under Section 15(a), premised on allegations that the defendant collected and disclosed plaintiff’s biometric identifiers without prior consent.  The plaintiff also alleged that the defendant failed to develop, publicly disclose, and implement a data retention schedule for the destruction of employee biometric identifiers, and failed to destroy the plaintiff’s biometric data when she left the company.  The Court distinguished the “mere procedural failure” in Bryant when holding that the Fox Plaintiff had sufficiently alleged facts to satisfy Article III standing.  Specifically, the Court noted that the plaintiff “allege[d] a concrete and particularized invasion of her privacy interest in her biometric data stemming from [defendant’s] violation of the full panoply of its Section 15(a) duties [] resulting in the wrongful retention of her biometric data after her employment ended.”

In a January 2021 decision, the Seventh Circuit further acknowledged that Section 15(c) BIPA claims (prohibiting entities from selling or otherwise profiting from biometric data) could also be pled so as to avoid Article III standing.  In holding the named plaintiffs lacked standing to litigate their claims in federal court, the Seventh Circuit observed that “[i]t is no secret to anyone that [plaintiffs] took care in their allegations, and especially in the scope of the proposed class they would like to represent, to steer clear of federal court.  But in general, plaintiffs may do this.”

Some Attempts to Push BIPA Litigation Into Arbitration Rejected

Companies facing BIPA lawsuits have several lines of attack, including on grounds of personal jurisdiction, statute of limitations, constitutionality of the statute itself, preemption by other state/federal laws, and various statutory defenses.  And some companies have been able to avoid class actions by invoking arbitration clauses.  This year, for example, an Illinois federal court set aside claims that Southwest Airlines violated BIPA by requiring employees to clock in and out by scanning their fingerprints, holding that employees had to pursue their claims as individuals in arbitration, not as a class in federal court.

However, not all efforts to compel arbitration were successful.  When these motions were denied in 2021, it was on the basis that the plain language of the agreement to arbitrate did not extend to the parties or claims involved in the underlying BIPA litigation.

Ambiguity Remains Over BIPA Damages Accrual, But Clarity Provided on Statute of Limitations

Notable BIPA litigations in 2021 addressed two critical issues under the statute: the applicable statute of limitations for BIPA claims and when claims accrue (i.e., whether a claim accrues only when data regulated by the statute is first collected, or whether a defendant can commit recurring violations of the statute—such as whenever an employee clocks in or clocks out—with liquidated statutory damages available for each independent collection).

No overview of BIPA litigation in 2021 would be complete without Cothron v. White Castle, No. 20-3202 (7th Cir.).  Plaintiff had begun working at White Castle in 2004 and consented to the collection of her biometric data in 2007, after White Castle began using an optional finger-scan system for employees.  The employee brought suit 11 years later, in 2018, for purported BIPA violations, alleging that White Castle had not obtained consent to collect or disclose her fingerprints when the collection first occurred because BIPA did not exist in 2007—the law was enacted in 2008.  Plaintiff alleged that each collection of her fingerprints was a separate BIPA violation.

Most recently, the case was appealed to the Seventh Circuit, which heard oral argument in September 2021.  On December 21, 2021, the Seventh Circuit certified the accrual question to the Illinois Supreme Court, finding that “[w]hether a claim accrues only once or repeatedly is an important and recurring question of Illinois law implicating state accrual principles as applied to this novel state statute.  It requires authoritative guidance that only the state’s highest court can provide.”

And on the statute of limitations front, in September a panel for the Illinois Court of Appeals addressed whether BIPA claims are potentially subject to a one-, two-, or five-year statute of limitations.  Tims v. Black Horse Carriers, Inc., 2021 IL App (1st) 200563 (Sep. 17, 2021).  The Court held Illinois Code Section 13-201 (the one-year limitations period) governs BIPA actions under Section 15(c) and (d) while Illinois Code Section 13-205 (the five-year limitations period) governs BIPA actions under Sections 15(a), (b), and (e).

BIPA Preemption Issues Continue

Another line of attack favored by defendants in BIPA litigation has been the assertion of federal preemption.  Throughout 2021, defendants explored a number of arguments that plaintiffs’ claims were precluded by federal law.

Such was the case in Fleury v. Union Pac. R.R. Co., No. 20-cv-00390, 2021 U.S. Dist. LEXIS 55766 (N.D. Ill. Mar. 24, 2021), in which the railroad moved to dismiss a truck driver’s lawsuit.  The truck driver claimed he was required to “scan” his biometric information without his consent when he visited the defendant’s facilities, in violation of BIPA.  The defendant argued that two federal statutes, addressing railroad safety and security, prevent state law from encroaching on the matter.  The court ruled that there was not yet enough information in the record to properly assess the argument, and denied the motion as premature.  In another preemption opinion this year, a federal court granted a motion to dismiss, finding that the plaintiff’s BIPA claims were preempted by the Labor Management Relations Act.  Barton v. Swan Surfaces, LLC, No. 20-cv-499, 2021 U.S. Dist. LEXIS 38464 (S.D. Ill. Mar. 2, 2021).  The court agreed with the defendant employer that the plaintiff’s BIPA claims would require interpretation of the plaintiff’s collective bargaining agreement.

AI-Based BIPA Cases Increase In Frequency In 2021

BIPA fingerprint cases (both for timekeeping purposes and otherwise) continue to be the most frequent type of BIPA litigation.  In 2021, however, a trend developed, with an increasing number of cases filed over defendants’ use of AI technology.

Biometric identifiers under BIPA are “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”  Although the statute itself does not define “scan of facial geometry” or “faceprint,” case law has historically treated these terms as referring to the measurements of distances between various facial features used to generate a unique numerical representation of an individual face.  A number of cases were filed this year in which plaintiffs targeted AI algorithms that purportedly used facial recognition to enhance the customer experience.  By way of example, several beauty companies were sued over virtual makeup apps that allowed customers to “try on” products prior to purchase.  In these cases, should they survive past the pleadings stage, liability under BIPA will hinge upon how the technology at issue functions and what data is collected and used.

Similarly, several “voiceprint” lawsuits were also filed under BIPA this year, including in the context of AI.  One notable putative class action was Carpenter v. McDonald’s Corporation, Case No. 1:21-cv-02906 (N.D. Ill.), which alleged that defendant McDonald’s had failed to comply with BIPA’s requirements in implementing a new AI voice assistant at its drive-through locations.  Most recently, the plaintiff’s BIPA claims were remanded to state court.

Other Legislative Developments to Keep an Eye on in 2022

CPW regulars should find it no surprise that BIPA dominated the world of biometric data privacy litigation.  That said, 2021 was a significant year for biometric data, even outside of Illinois.

New York Biometric Data Laws

Although a number of states have made moves to enact biometric laws, new regulations and laws in New York were a standout in 2021.

In August 2021, the Tenant Data Privacy Act (“TDPA”) took effect, though the Act will not be enforceable until 2023.  Owners of “smart access buildings” are now required to obtain express consent to collect biometric data for use in their smart access systems.  The owner must also create a written privacy policy that informs tenants of various aspects of the data collection.  On top of all this, the TDPA limits how the data can be retained or sold, placing substantial restrictions on the time the data may be stored and all but eliminating disclosure to a third party without express written consent.  Perhaps most notably, the TDPA includes a private right of action, allowing individuals to bring suit against landlords who allegedly fail to properly protect their data or otherwise violate the TDPA.

Meanwhile, New York City also amended its Administrative Code, establishing new standards for commercial use of customers’ biometric data.  Any commercial establishment that collects, retains, converts, stores, or shares “biometric identifier information” must now post clear and conspicuous notice of such collection at all customer entrances.  Establishments are also barred from profiting from the transaction of the information in any way.  As with the TDPA, this is enforced via a private right of action that could subject businesses to substantial penalties.

FTC Notice of Rulemaking

In December the FTC issued a notice (“Notice”) that it was “considering initiating a rulemaking under Section 18 of the FTC Act to curb lax security practices, limit privacy abuses, and ensure that algorithmic decision-making does not result in unlawful discrimination.”

There are a range of privacy, cybersecurity and AI issues that the FTC may seek to regulate, as previewed by its Notice.  For instance, as seen in an April 2021 release, the FTC has increasingly cautioned that AI may “inadvertently introduc[e] bias or other unfair outcomes” in medicine, finance, business operations, media, and other sectors.  In addition, the FTC declared algorithmic and biometric bias a focus of enforcement in resolutions passed this fall.  The Notice builds upon this focus, with its reference to “unlawful discrimination” likely signaling rulemaking directed at AI.

Regardless of what 2022 brings, it will undoubtedly be another busy year in the realm of biometric litigation and enforcement.  Not to worry, CPW will be there to keep you informed every step of the way.  Stay tuned.

CPW previously covered a district court’s refusal to compel arbitration in litigation brought against a biometric software developer under the Illinois Biometric Information Privacy Act (“BIPA”), finding that the relevant arbitration agreement did not cover the defendant.  Sosa v. Onfido, Inc., 2021 U.S. Dist. LEXIS 658 (N.D. Ill.).  Yesterday the Seventh Circuit affirmed the ruling, agreeing with the district court “in all respects.”  Sosa v. Onfido, Inc., 2021 U.S. App. LEXIS 23816 (7th Cir. Aug. 11, 2021).  Read on to learn more and what it means going forward.

As readers will recall, the plaintiff in Sosa had an account with OfferUp, Inc., a marketplace where people buy and sell goods online.  According to the pleadings, OfferUp partnered with the defendant, Onfido, to verify users’ identities.  Specifically, the plaintiff alleged that users (including himself) upload their driver’s license or ID along with photos of their faces, and that Onfido’s software scans the images and extracts biometric identifiers in order to confirm whether they match the uploaded IDs.  The plaintiff filed a putative class action complaint, alleging that Onfido violated BIPA by collecting and storing biometric information without obtaining written releases and providing certain required notices.

Onfido invoked the arbitration provision in OfferUp’s Terms of Service, which Onfido claimed the plaintiff agreed to when he registered for OfferUp and each time he accessed his account.  Ordinarily, as a matter of Illinois law, only signatories to an arbitration agreement can enforce it, but Onfido argued that three court-recognized exceptions to this rule applied: (1) third-party beneficiary, (2) equitable estoppel, and (3) agency.

The district court rejected each of Onfido’s nonparty contract enforcement theories and denied Onfido’s motion to compel individual arbitration.  Among other findings, the district court held that Onfido “failed to establish that it was a third-party beneficiary of the Terms of Service or that it could otherwise enforce the contract’s arbitration provision either as an agent of OfferUp or on equitable estoppel grounds.”

An appeal to the Seventh Circuit followed.

Assessing the district court’s refusal to compel arbitration de novo, the Seventh Circuit noted that Illinois courts recognize a “strong presumption against conferring contractual benefits on noncontracting third parties” and “[t]o overcome that presumption, ‘the implication that the contract applies to third parties must be so strong as to be practically an express declaration.’” (emphasis supplied) (quotation omitted).  Here, Onfido was not named in the Terms of Service nor did any other provision establish its status as a third-party beneficiary.  To the contrary, the Seventh Circuit held, the Terms of Service explicitly state “that the contract creates no ‘private right of action on the part of any third party.’”

Nor did any of Onfido’s other arguments pass muster.  First, Onfido’s agency theory failed, as the Seventh Circuit ruled “that OfferUp encouraged users to register their identities with the app’s TruYou feature and that Onfido and OfferUp partnered to provide this technology through the app establishes nothing more than a business relationship between the parties—not agency.” (emphasis supplied).  Second, the Seventh Circuit found no equitable considerations supporting adoption of Onfido’s position.

This ruling confirms that the plaintiff’s BIPA claims against this ID-verifying software developer will be resolved in federal court, not arbitration.  The case also offers a cautionary note: while an arbitration agreement can defeat a data privacy litigation, such provisions must be carefully drafted to cover anticipated claims and disputes.  For more on this, stay tuned.  CPW will be there to keep you in the loop.

At this point, readers of CPW are familiar with the Clearview Illinois Biometric Information Privacy Act (“BIPA”) litigation.  The case raises novel data privacy and constitutional issues, as underscored by a recent development in the case.

Clearview previously moved to dismiss Plaintiffs’ claims under BIPA and various other states’ laws.  Among other arguments, Clearview claimed that Plaintiffs were improperly attempting to apply BIPA to Clearview’s out-of-state conduct in violation of Illinois’ extraterritoriality doctrine (which requires that the conduct at issue occurred “primarily and substantially” in the state).  This standard was plainly not satisfied here, Clearview argued, as none of the conduct relevant to Plaintiffs’ claims occurred in Illinois, and therefore the litigation should be dismissed.  Clearview also argued that if BIPA applied to Clearview’s conduct, then BIPA would violate the dormant Commerce Clause of the U.S. Constitution, which precludes the application of a state statute that has the effect of regulating conduct in another state.

Besides these challenges, Clearview asserted that Plaintiffs’ claims are barred by the First Amendment and Article I, Section 4 of the Illinois Constitution.  According to Clearview, this is because both protect the creation and dissemination of information—which includes the collection and use of public photographs that appear on the Internet.  Apart from these constitutional challenges, Clearview also argued that Plaintiffs failed to plead a cognizable BIPA claim under Section 15(c) of the statute (to be discussed on this blog another day).

Plaintiffs have opposed Clearview’s motion.  Last week, several consumer privacy groups weighed in, seeking leave to file amicus briefs supporting Plaintiffs—including the Electronic Frontier Foundation (“EFF”) and the Center on Privacy & Technology at Georgetown Law (“Center”).  Unsurprisingly, these groups take the contrary view to Clearview as to BIPA and whether it passes constitutional muster.  For example, as recently argued by the Center, BIPA is a content-neutral law that protects against the harm facial recognition technology poses to Illinois residents’ rights to privacy and free expression.  This includes protecting residents from police misuse of facial recognition technology.  [Note: Remember Clearview’s customers?]

As more states (like New York) pass biometric laws, similar arguments are going to be raised in future data privacy litigations.  Although how the court rules regarding BIPA in the Clearview litigation will not be dispositive for these cases, it will provide a useful metric for predicting the direction of the law on this topic.  Not to worry, CPW will be there every step of the way to keep you in the loop.  Stay tuned.

Many of the litigations that CPW has previously covered involving Illinois’ Biometric Information Privacy Act (“BIPA”) have turned on claims against parties that directly used biometric technology to collect and store personal information.  These parties are often employers collecting information about their employees, such as by having employees scan fingerprints to clock in and out.  In case you need a refresher, check out some of CPW’s prior posts here and here.  But what about the manufacturers of those biometric technologies?  Three recent Illinois federal and state lawsuits illustrate potential litigation risks for third party vendors under BIPA.  Read on below.

First, a refresher for you BIPA novices out there.  BIPA was enacted in 2008.  It requires, among other things, that:

  • A private entity must establish and make publicly available a protocol for retaining and handling biometric data.
  • A private entity must first inform the subject in writing about the purpose of collecting the data and how long the data will be kept, and must obtain the subject’s consent.
  • This data must be destroyed: (1) when the initial purpose for collecting or obtaining such identifiers or information has been satisfied or (2) within 3 years of the individual’s last interaction with the private entity (whichever occurs first).
  • Sales, leases, trades, or further actions in which a private entity may profit from a person’s biometric information are strictly prohibited while disclosures, redisclosures, or other dissemination of a person’s biometric information are statutorily limited.
  • Finally, private entities must protect biometric information from disclosure using “the reasonable standard of care within the private entity’s industry . . . . [and] in a manner that is the same as or more protective than the manner in which the private entity stores, transmits, and protects other confidential and sensitive information.”

With that in mind, let’s turn to the case law.

Figueroa v. Kronos Inc., 454 F. Supp. 3d 772 (N.D. Ill. 2020) demonstrates a successful claim by plaintiffs against a third party vendor for violations of BIPA.  Plaintiffs brought a putative class action against Kronos, the timekeeping company used by their employer, for violations of sections 15(a), (b), and (d) of BIPA.[1]  Kronos brought a 12(b)(6) motion to dismiss and moved in the alternative to strike plaintiffs’ class allegations, but the court denied both motions.  While Kronos had argued that Section 15(b) delegated the applicable notice and consent obligations for obtaining individuals’ biometric information to plaintiffs’ employer, the court found that Kronos was still a “private entity” as defined by BIPA, and had to comply with the same obligations.  The court also found that plaintiffs had sufficiently pled under BIPA Section 15(d) that Kronos had disseminated their data to third parties that hosted the biometric data in Kronos’ data centers.

On the other hand, in Bray v. Lathem Time Co., No. 19-3157, 2020 U.S. Dist. LEXIS 53419 (C.D. Ill. March 27, 2020), a third party vendor prevailed in having a putative class action dismissed.  As in Figueroa, plaintiff had sued the timekeeping company his employer used for violations of Sections 15(a), (b), and (d) of BIPA.  Lathem moved to dismiss for failure to state a claim and lack of personal jurisdiction.  Ruling on Lathem’s motion, the court found that it lacked personal jurisdiction over Lathem.  It agreed that Lathem itself had not created sufficient “minimum contacts” with Illinois, because Lathem was a Georgia-based company that had no corporate presence in Illinois and had not targeted Illinois or made any direct sales there.  On this basis, Lathem successfully argued that any contacts it had with Illinois were the result of decisions made by its customers – the employers who had chosen to use its timekeeping software – and that it could not be subject to personal jurisdiction in Illinois based on its customers’ decisions.  Lathem had also separately argued that BIPA was not intended to apply to third party vendors like itself, and only provided a cause of action against plaintiff’s employer, but the court did not address this argument.

The jury is still out on one final state court action against a third party vendor.  In Bernal v. ADP, No. 2017-CH-12364, 2019 Ill. Cir. LEXIS 1025 (Ill. Cir. Ct. Cook Cty. Aug. 23, 2019), plaintiff had initially brought suit against his employer alleging violations of Sections 15(a)-(d) of BIPA, but amended the complaint to bring suit only against ADP, the entity which provided the biometric scanning technology his employer used to clock employees in and out.  ADP was successful in having plaintiff’s first complaint dismissed for failure to state a claim, as the court found that the complaint did not contain sufficient factual allegations for any of the alleged violations of BIPA.  For example, the court found that plaintiff had not sufficiently alleged violations of Section 15(d) because it raised only conclusory allegations that ADP’s technology allowed for and resulted in the dissemination of biometric information to third parties.  Similar to Bray, ADP also argued that Section 15(b) should not apply to third party entities like itself.  The court declined to rule on this argument, as it found that the complaint did not contain sufficient factual allegations to show ADP’s involvement in the actions plaintiff alleged, apart from supplying plaintiff’s employer with the technology used.  The court, however, granted plaintiff leave to file an amended complaint, which he did, and the litigation is ongoing.

While suits against employers are still the most prominent BIPA trend, vendors manufacturing biometric technology or software are not without risk.  As demonstrated by these cases, the most important inquiry for BIPA suits against third party vendors will likely be jurisdictional (whether the court can actually exercise personal jurisdiction over the vendor).  Courts will also likely continue to face the question of whether BIPA is intended to provide a private right of action against third party vendors or solely against the parties employing vendors’ software or technologies.  For more on this developing area of the law, stay tuned.  CPW will be there.

[1] Section (a) of BIPA provides that a private entity possessing biometric identifiers or information must have a written policy that is made available to the public, including a retention schedule and guidelines for destroying biometric information.  Section (b) gives guidelines for collecting, capturing, or receiving biometric information, and Section (d) requires that an entity obtain individuals’ consent before disclosing or disseminating biometric data.

The Illinois Biometric Information Privacy Act (“BIPA”) regulates the collection and use of biometric data and includes a private right of action as an enforcement mechanism.  This month, a BIPA class action lawsuit was filed against Del Monte Foods, Inc. in the Circuit Court of Cook County, Illinois.  In Metoyer v. Del Monte Foods, Inc., No. 2020CH06697, the plaintiff alleges several violations of BIPA related to his former employer’s timekeeping practices.  This case reaffirms that employers must ensure that their practices concerning the collection and use of employees’ biometric information comport with BIPA – as employers who fail to do so are finding their practices challenged in court (for CPW’s prior coverage of this issue, see here and here).

At its core, BIPA protects the “biometric information” of Illinois residents, which is any information based on “biometric identifiers” that identifies a specific person—regardless of how it is captured, converted, stored, or shared.  740 ILCS 14/10.  Biometric identifiers are “a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”  Id. (collectively, with “biometric information,” “biometric data”).

Employers are increasingly using fingerprint scans for timekeeping purposes (which fall within the ambit of BIPA).  However, the plaintiff in Metoyer, a former Del Monte Foods employee, alleges that his former employer instead “required that employees have face(s) scanned by a biometric timekeeping device.”  While an employee at Del Monte Foods, the plaintiff alleges that he was required to submit to facial scanning whenever he clocked in and clocked out.

Although the use of facial scans is regulated under BIPA, the plaintiff alleges that Del Monte Foods failed to conform its conduct to the law’s requirements.  Specifically, the complaint alleges that Del Monte Foods failed to:

  • Properly inform Plaintiff and others similarly situated in writing of the specific purpose and length of time for which their face scan(s) were being collected, stored, disseminated and used;
  • Provide a publicly available retention schedule and guidelines for permanently destroying Plaintiff’s and other similarly-situated individuals’ face scan(s);
  • Receive a written release from Plaintiff and others similarly situated to collect, store, disseminate or otherwise use their face scan(s); and
  • Obtain consent from Plaintiff and others similarly situated to disclose, redisclose, or otherwise disseminate their biometric identifiers and/or biometric information to a third party.

The complaint pleads three counts against Del Monte Foods under BIPA regarding these alleged practices.  It also seeks to certify a class consisting of “[a]ll persons who were enrolled in the biometric timekeeping system and subsequently used a biometric timeclock while employed/working for Defendant in Illinois during the applicable statutory period.”

Noting that biometrics are unique, permanent identifiers, the complaint alleges that “[n]o amount of time or money can compensate Plaintiff if his biometric data is compromised by the lax procedures through which Defendant captured, stored, used, and disseminated Plaintiff’s and other similarly-situated individuals’ biometrics.”  Notwithstanding the alleged threat of irreparable future harm, the plaintiff additionally demands monetary damages consistent with BIPA’s statutory caps of $5,000 for each intentional and/or reckless violation and $1,000 for each negligent violation, as well as other equitable relief.

Since BIPA’s enactment, the largest number of BIPA lawsuits have been filed against employers that collect their employees’ biometric data for timekeeping purposes.  Additionally, BIPA class actions have also recently been brought against non-employer third parties that manufacture and/or operate such timekeeping systems.  The complaint in Metoyer is consistent with these general trends.  According to the case docket, the litigation is scheduled for case management call on March 10, 2021.  Stay tuned.

Earlier this month CPW’s Kristin Bryan and Kyle Fath presented a webinar on “AI and Biometrics Privacy: Trends and Developments” with the International Association of Privacy Professionals (“IAPP”), the largest global community of privacy professionals.  A recording of that webinar is available to all IAPP members (for CPE credit) here.

As summarized in the program description on the IAPP website:

Artificial intelligence and biometrics privacy are top-of-mind issues for companies and their privacy professionals, regardless of the industry sector in which they operate. AI will soon be regulated in the U.S. in an unprecedented manner: The patchwork of 2023 state privacy laws imposes restrictions and obligations on organizations carrying out AI, profiling and automated decision-making processes, and the Federal Trade Commission is poised to promulgate regulations on automated decision-making and related topics. Organizations employing facial recognition and other biometrics technologies are under the constant threat of putative privacy class-action litigations under Illinois’ Biometric Information Privacy Act and a handful of other state laws. With BIPA copycats and similar legislation introduced across the country, and a lack of clarity in the current case law, the risk associated with biometrics will certainly continue, and likely increase. Needless to say, global developments in these areas add further complexity to organizations with international operations.

The program addresses, among others:

  • AI, biometrics and privacy compliance — Restrictions on and obligations under forthcoming privacy laws in California, Colorado, Utah and Virginia, including with respect to profiling, automated decision-making, and sensitive data.
  • AI and biometrics litigation overview — The current litigation landscape concerning AI and biometrics, including facial recognition.
  • Legislative and regulatory priorities — Pending and anticipated legislative and regulatory developments, both federal and state, as well as globally.

Kristin and Kyle are also covering on CPW key developments regarding AI and biometric privacy in the realm of regulation, compliance and litigation.  You can check out their analyses of these issues here, here and here, with contributions from David Oberly and other team members.

For more on this, stay tuned.  CPW will be there to keep you in the loop.

Last year, CPW covered a litigation win by Clarifai, Inc., a technology company specializing in artificial intelligence, when a federal court granted its motion to dismiss claims brought under Illinois’ Biometric Information Privacy Act (“BIPA”) in Stein v. Clarifai, Inc., No. 20 C 1937, 2021 U.S. Dist. LEXIS 49516 (N.D. Ill. Mar. 16, 2021).  While the court’s ruling dismissed a putative class action, the Federal Trade Commission (“the Commission”) had already opened an investigation into a 2014 data-sharing incident that gave rise to the litigation.  Last month, the FTC took a big step to further that investigation with respect to Match Group (“Match”), the parent company of the entity from whom Clarifai pulled facial data.

As a brief reminder, last year the Northern District of Illinois found that it lacked personal jurisdiction over Clarifai, which Plaintiff claimed had violated BIPA by harvesting facial data from dating site OkCupid without obtaining consent from the website’s users or making the necessary disclosures. The court found that Clarifai had only sold data to two customers in Illinois, which had resulted in only seven cents of revenue; these de minimis sales did not suffice to establish personal jurisdiction over Clarifai.

Although the civil litigation was dismissed last year, in 2020, the Commission issued a civil investigative demand (“CID”) to Match, the parent company of OkCupid, as part of its investigation into the 2014 data-sharing incident between OkCupid and Clarifai that gave rise to the litigation. The investigation, which remains ongoing, is aimed at determining whether “unnamed persons, partnerships, corporations, or others are engaged in, or may have engaged in, deceptive or unfair acts or practices related to consumer privacy and/or data security,” based on information indicating that Clarifai had obtained photos and user data from OkCupid and used the data in a face database Clarifai had built to train its facial recognition technology.

Last month, the Commission filed a petition in the U.S. District Court for the District of Columbia requesting compliance with the CID.  While the petition acknowledges that Match has produced some responsive documents, the Commission claims that numerous other documents and communications related to the data-sharing incident have been withheld since 2020 based on “improper and overbroad assertions of attorney-client privilege and the work product doctrine.”  The petition requests that Match produce the documents it has withheld, or, alternatively, that the Court conduct an in camera review of the documents.

CPW will continue to keep an eye on this investigation, and how its resolution might impact similar inquiries into data privacy incidents, for you.