Earlier this month CPW’s Kristin Bryan and Kyle Fath presented a webinar on “AI and Biometrics Privacy: Trends and Developments” with the International Association of Privacy Professionals (“IAPP”), the largest global community of privacy professionals.  A recording of that webinar is available to all IAPP members and can be viewed (for CPE credit) here.

As summarized in the program description on the IAPP website:

Artificial intelligence and biometrics privacy are top-of-mind issues for companies and their privacy professionals, regardless of the industry sector in which they operate. AI will soon be regulated in the U.S. in an unprecedented manner: The patchwork of 2023 state privacy laws imposes restrictions and obligations on organizations carrying out AI, profiling and automated decision-making processes, and the Federal Trade Commission is poised to promulgate regulations on automated decision-making and related topics. Organizations employing facial recognition and other biometrics technologies are under the constant threat of putative privacy class-action litigations under Illinois’ Biometric Information Privacy Act and a handful of other state laws. With BIPA copycats and similar legislation introduced across the country, and a lack of clarity in the current case law, the risk associated with biometrics will certainly continue, and likely increase. Needless to say, global developments in these areas add further complexity to organizations with international operations.

The program addresses, among other topics:

  • AI, biometrics and privacy compliance — Restrictions on and obligations under forthcoming privacy laws in California, Colorado, Utah and Virginia, including with respect to profiling, automated decision-making, and sensitive data.
  • AI and biometrics litigation overview — The current litigation landscape concerning AI and biometrics, including facial recognition.
  • Legislative and regulatory priorities — Pending and anticipated legislative and regulatory developments, both federal and state, as well as globally.

On CPW, Kristin and Kyle also cover key developments regarding AI and biometric privacy in the realm of regulation, compliance and litigation.  You can check out their analyses of these issues here, here and here, with contributions from David Oberly and other team members.

For more on this, stay tuned.  CPW will be there to keep you in the loop.

Last year, CPW covered a litigation win by Clarifai, Inc., a technology company specializing in artificial intelligence, when a federal court granted its motion to dismiss claims brought under Illinois’ Biometric Information Privacy Act (“BIPA”) in Stein v. Clarifai, Inc., No. 20 C 1937, 2021 U.S. Dist. LEXIS 49516 (N.D. Ill. Mar. 16, 2021). While the court’s ruling dismissed a putative class action, the Federal Trade Commission (“the Commission”) had already opened an investigation into the 2014 data-sharing incident that gave rise to the litigation. Last month, the FTC took a significant step to further that investigation with respect to Match Group (“Match”), the parent company of the entity from which Clarifai pulled facial data.

As a brief reminder, last year the Northern District of Illinois found that it lacked personal jurisdiction over Clarifai, which Plaintiff claimed had violated BIPA by harvesting facial data from dating site OkCupid without obtaining consent from the website’s users or making the necessary disclosures. The court found that Clarifai had sold data to only two customers in Illinois, generating just seven cents of revenue; these de minimis sales did not suffice to establish personal jurisdiction over Clarifai.

Although the civil litigation was dismissed last year, the Commission had already issued a civil investigative demand (“CID”) to Match, the parent company of OkCupid, in 2020 as part of its investigation into the 2014 data-sharing incident between OkCupid and Clarifai that gave rise to the litigation. The investigation, which remains ongoing, is aimed at determining whether “unnamed persons, partnerships, corporations, or others are engaged in, or may have engaged in, deceptive or unfair acts or practices related to consumer privacy and/or data security,” based on information indicating that Clarifai had obtained photos and user data from OkCupid and used the data in a face database Clarifai had built to train its facial recognition technology.

Last month, the Commission filed a petition in the U.S. District Court for the District of Columbia requesting compliance with the CID. While the petition acknowledges that Match has produced some responsive documents, the Commission claims that numerous other documents and communications related to the data-sharing incident have been withheld since 2020 based on “improper and overbroad assertions of attorney-client privilege and the work product doctrine.” The petition requests that Match produce the documents it has withheld or, alternatively, that the Court conduct an in camera review of the documents.

CPW will continue to keep an eye on this investigation for you, including how its resolution might impact similar inquiries into data privacy incidents.

A reminder that tomorrow, Thursday, June 2, 2022, at 11 a.m. EST, in a new IAPP web conference, data privacy thought leaders Kyle Fath and Kristin Bryan will take a look at key developments and trends in the evolving areas of artificial intelligence (AI) and biometrics. During the session, our subject matter specialists will discuss:

  • AI, biometrics, and privacy compliance – Restrictions on and obligations under forthcoming privacy laws in California, Virginia, Colorado and Utah, including with respect to profiling, automated decision-making and sensitive data.
  • AI and biometrics litigation overview – The current litigation landscape concerning AI and biometrics (including facial recognition).
  • Legislative and regulatory priorities – Pending and anticipated legislative and regulatory developments, both federal and state, as well as global.

The IAPP is the largest and most comprehensive global information privacy community and resource that helps define, promote and improve the privacy profession globally.

Click here to register.

This week, Lexis Practical Guidance released new, timely biometric privacy compliance guidance materials—its “Mitigating Legal Risks When Using Biometric Technologies” Practice Note and Biometric Privacy Compliance Checklist—both of which were authored by CPW’s own David Oberly.

You can view David’s “Mitigating Legal Risks When Using Biometric Technologies” Practice Note by clicking here and his Biometric Privacy Compliance Checklist by clicking here.

In a new IAPP web conference on Thursday, June 2, 2022 at 11 a.m. EST, data privacy thought leaders Kyle Fath and Kristin Bryan will take a look at key developments and trends in the evolving areas of artificial intelligence (AI) and biometrics. During the session, our subject matter specialists will discuss:

  • AI, biometrics, and privacy compliance – Restrictions on and obligations under forthcoming privacy laws in California, Virginia, Colorado and Utah, including with respect to profiling, automated decision-making and sensitive data.
  • AI and biometrics litigation overview – The current litigation landscape concerning AI and biometrics (including facial recognition).
  • Legislative and regulatory priorities – Pending and anticipated legislative and regulatory developments, both federal and state, as well as global.

The IAPP is the largest and most comprehensive global information privacy community and resource that helps define, promote and improve the privacy profession globally.

Click here to register.

This week members of the CPW team, including subject matter experts Kristin Bryan, Kyle Fath and David Oberly, offered insights into trends across the biometric privacy and artificial intelligence landscape.  They also addressed what has transpired in 2022 in this space and what may be on the horizon.  This included:

  • AI and privacy compliance – An overview of restrictions on and obligations with respect to AI, profiling and other automated decision-making processes under forthcoming privacy laws in California, Virginia, Colorado and Utah.
  • AI and biometrics litigation overview – An overview of the current litigation landscape concerning biometric data and AI, as well as related insights.
  • State legislative priorities – Approaches states are taking to the use of facial recognition technology.
  • Anticipated federal developments – Proposed federal legislation concerning biometrics, AI and other anticipated developments in 2022.

If you were not able to attend the live webinar, you can check out the recording here.  We welcome your questions and interest – for any follow-up, do not hesitate to reach out to any member of our team directly.  And be sure to check out the CPW Team’s 2022 Q1 AI/Biometric Litigation Trends from last week, which is available here.

What it means to “collect” or “possess” biometric data has developed into one of the fiercest battlegrounds between plaintiffs and defendants in bet-the-company Illinois Biometric Information Privacy Act (BIPA) class action litigation. The issue is a significant one in BIPA litigation—and oftentimes determinative of whether a private entity can be held liable for violations of Illinois’ stringent biometric privacy statute. CPW’s David Oberly examined one of the first BIPA opinions to date to rule on the issue of biometric data collection in the context of summary judgment in his Law360 piece, “Biometric Data Collection Takeaways From BNSF Ruling.”

With the first quarter of 2022 at a close, litigation involving the collection and protection of biometric data has gotten off to a hot start, setting a brisk pace that could mean big things for data privacy litigation in 2022 (with crossover impact on data breach and cybersecurity litigation, as outlined below).  Read on to see what trends CPW has seen, and which topics we will be keeping our eyes on as the year continues.  For more information, be sure to register for our webinar on April 5 from 12-1 pm on “Developments and Trends Concerning Biometric Privacy and Artificial Intelligence.”

I.     New Biometric Privacy Cases Filed in Q1 2022

At the time of writing, more than one hundred and ten cases related to biometric data privacy have been filed.  It should come as no surprise to regular CPW readers to learn that nearly all of these cases were putative class actions filed in Illinois alleging damages under the Biometric Information Privacy Act (“BIPA”).  For those new to CPW, BIPA is a state statute that provides Illinois citizens with a private cause of action if their biometric information has been collected or shared without their informed consent.

Some quick statistics about these BIPA cases:

  • The majority, more than sixty-five cases, involved claims resulting from fingerprints captured by timekeeping machines used by the plaintiffs’ employers.
  • Twenty-five of these litigations involved allegations that the fingerprints were collected without the plaintiffs’ knowledge or consent, while nineteen complaints alleged that the employer failed to provide the plaintiffs with information relevant to the recording and retention of the information. Additionally, thirteen litigations sought damages alleging that the plaintiffs’ employers failed to safeguard the data from third parties and/or hackers.  Finally, eight plaintiffs simply alleged that the employer never obtained written consent as required under the statute.
  • Eleven litigations were filed concerning allegations that the defendant had obtained the plaintiffs’ facial geometry without knowledge or informed consent, or without safeguarding the information from third parties—a growing area of BIPA litigation, consistent with prior trends.
  • Moreover, ten cases concerned claims involving the collection of voice recognition data—another growing area of potential litigation risk.

II.   Biometric Privacy Cases to Watch in 2022

CPW has identified a number of biometric cases as ones to keep an eye on as the year progresses.  These include:

Stein v. Clarifai, Inc., No. 22 CV 314 (D. Del.): After winning dismissal of a BIPA class action filed in Illinois on personal jurisdiction grounds (covered by SPB team member David Oberly for Bloomberg Law here), AI software developer Clarifai found itself hauled into court once again—this time in Delaware—for purportedly running afoul of Illinois’ biometric privacy statute.  In that case, Stein v. Clarifai, Inc., the complaint alleges that Clarifai—which specializes in machine learning to identify and analyze images and videos using facial recognition technology—improperly harvested facial template data from OkCupid dating profile photos without providing notice or obtaining consent.  If this procedural posture seems familiar, that is because it parallels another recent BIPA class action involving a cloud-based call center entity and its integrated voiceprinting technology provider—which was also refiled in Delaware after being dismissed in Illinois due to an absence of personal jurisdiction.  The plaintiffs in the earlier voiceprint class action fared no better the second time around, with a Delaware federal court dismissing the re-filed suit based on a successful extraterritoriality challenge.  Only time will tell whether the Clarifai suit will be able to avoid the same fate.

Roberts v. Cooler Screens Inc., No. 2022-CH-0184 (Ill. Cir. Ct. Cook Cnty.):  In another recently-filed case, Roberts v. Cooler Screens Inc., Cooler Screens’s “Smart Coolers” have been targeted for allegedly improper biometric data collection practices that purportedly violate BIPA.  “Smart Coolers” retrofit refrigerator cases in retail stores, replacing the doors with digital screens that provide an “interactive experience” to customers.  This experience, according to the complaint, includes a “facial profiling system” that “detect[s] the age, gender, and emotional response of over 3 million verified daily viewers.”  The facial recognition system analyzes each customer, determining which advertisements and suggestions are most likely to lead to a purchase.  While some might view this as an exciting advertising opportunity, the plaintiff saw it otherwise.  The complaint characterizes the technology as an unlawful collection of biometric data and a violation of BIPA’s requirements to provide information and obtain consent.  This case will be worth watching, as the overlapping space between developing technology and efforts to ensure the privacy of biometric data is likely to lead to further litigation in the near future.

Copple v. Arthur J Gallagher & Co., No. 22 CV 116 (W.D. Wash.): Outside of BIPA claims, some litigants have alleged harms arising from biometric data in other contexts.  In Copple v. Arthur J Gallagher & Co., a ransomware attack resulted in a putative class action filed against the defendant, “one of the leading insurance brokerage, risk management, and HR & benefits consulting companies in the world.”  The plaintiffs in this action allege that a number of the defendant’s clients provided the defendant with the plaintiffs’ personally identifiable information (“PII”) and protected health information (“PHI”) without the plaintiffs’ knowledge or consent.  According to the complaint, the defendant was struck by a cyberattack beginning in June 2020 but did not discover the attack until September 26, 2020.  The company allegedly did not begin notifying plaintiffs of the breach, however, until more than nine months later, in July 2021.  Over the next six months, the company provided almost weekly reports to the state Attorney General that included an increasing number of affected individuals, beginning with only 1,825 Washington residents in its initial July 13, 2021 report and culminating in 72,835 by December 6, 2021.  Plaintiffs seek damages, claiming that the PII and PHI are likely to appear on the dark web and that class members were harmed by the significant delay in notifying them of the breach.

III.     Notable 2022 Trends in Biometric Privacy Litigation

From a broader perspective, there are several areas of activity in BIPA class action litigation that are worth keeping an eye on as we head into the second quarter of 2022.

Voiceprints, Take II: One noteworthy trend that has developed since the start of the year is an increased volume of BIPA class action filings targeting voice biometric technologies.  Voice biometrics (also known as “voiceprints”) rely on the analysis of unique voice patterns to identify or verify individuals’ identities.  In other words, this is the use of a biological characteristic—one’s voice—to verify an individual’s identity.  Voiceprints can be distinguished from general voice data, which merely captures a person’s voice without analyzing the components of the voice and/or generating a voiceprint for the purpose of verification or identification.  While voiceprints fall under BIPA’s scope, courts have held that general voice data does not, with the important dividing line being whether the data can be used to identify an individual.

In mid-2021, a wave of lawsuits was filed targeting voice-powered technologies—including a high-profile suit involving McDonald’s drive-thru voice assistants, which SPB team member Kristin Bryan covered extensively in CPW articles here, here, and here.  The majority of this litigation fell flat because the technology at issue ultimately did not involve voiceprints, but rather tech that merely captured or used individuals’ voice data.  It appears that enterprising plaintiffs’ attorneys have again turned their attention to voice data in 2022, with one main difference.  This time around, these BIPA class actions are focusing narrowly on voice data that is used specifically for time and attendance purposes.  Because timekeeping necessarily involves the verification of individuals’ identities, there is a reasonable likelihood that this round of filings may fare differently than the 2021 wave, in which the majority of suits were dismissed within a short period of time after they were filed.

Additional Uses of Facial Recognition: Similarly, there has also been a wave of new BIPA filings focused on targeting timekeeping systems that utilize facial recognition software.  While facial biometrics has long been one of the most popular targets for BIPA class actions, in the timekeeping context these actions have traditionally been confined to the use of fingerprint time and attendance systems.  That is no longer the case in 2022.

Facial Recognition Cameras Used for Vehicle Monitoring: Facial recognition-powered cameras used to monitor vehicle fleets and their drivers have also emerged as a new favorite target for BIPA class actions.  Transportation companies are increasingly relying on facial recognition cameras to analyze video collected from cameras mounted on the interior windshields of vehicles in their fleets to monitor driver activity and protect these companies against losses from vehicle accidents.  The AI technology used by these facial recognition cameras allows for the monitoring of external variables such as cars and road signs.  More importantly, this AI tech allows the devices to continually monitor and classify their drivers’ status, including whether they are being attentive at the wheel.

According to recently-filed suits, these cameras also collect drivers’ facial data and analyze it to detect certain types of driver behavior, like distracted or drowsy driving, and then use a built-in cellular data link to upload the video, biometrics, and other data to the company’s servers.  While these suits allege that these cameras scan drivers’ facial geometry—which, if true, would bring these cameras within the scope of BIPA—it is uncertain whether this technology actually satisfies the definition of “facial recognition” under the law.  Importantly, this trend illustrates the complex compliance decisions that arise when attempting to mitigate BIPA liability exposure in connection with new and advanced technologies where courts have not clearly addressed whether they fall under the scope of Illinois’ biometric privacy law—and the need to consult with experienced biometric privacy counsel before rolling out any new type of biometric- or AI-related technology to ensure legal risks are addressed to the greatest extent possible.

IV.   Recent Significant Decisions in Biometric and AI Privacy Litigation

Several significant decisions concerning biometric and AI litigation have been handed down in Q1 2022.  Below, we highlight a few of these decisions as potential trends for 2022 litigation.

BIPA & Personal Jurisdiction:  As noted above, BIPA provides a cause of action for Illinois residents who believe their biometric information has been obtained or disclosed without consent.  One recent decision confirmed, though, that suits brought by Illinois residents do not automatically establish personal jurisdiction over a defendant; the residency of a plaintiff or putative class members alone may not be a sufficient connection to Illinois.

In Gutierrez v. Wemagine.ai LLP, 2022 U.S. Dist. LEXIS 14831 (N.D. Ill. Jan. 26, 2022), Plaintiffs claimed that defendant’s app obtained and disseminated the biometric information of its users without their written consent in violation of BIPA, but defendant was a Canadian company whose only contacts with Illinois were app downloads in the state.  Defendant moved to dismiss for lack of personal jurisdiction, and the court granted the motion, finding that defendant had not “targeted” the forum of Illinois (such as through marketing or sales).  While this is consistent with what many other courts have found with respect to personal jurisdiction, it sets an important precedent in the BIPA context that, without more, a plaintiff and/or putative class members are not a sufficient connection to Illinois for the purposes of BIPA—personal jurisdiction must still be proper and comport with due process.  It is also a reminder to entities sued under BIPA to thoroughly examine whether personal jurisdiction is proper and to litigate the issue emphatically if it is not.  This may explain the choice of venue in Stein v. Clarifai, Inc., and may suggest that more BIPA claims will be filed in out-of-state courts going forward.

BIPA Preemption: In a continuation of a 2021 trend, one often-raised defense to a BIPA suit is that the BIPA claims are preempted by a federal or Illinois state statute.  Three recent decisions illustrate both the strength and the limits of this complete defense to liability in BIPA litigation.

Federal Litigation: Just recently, an Illinois federal court in Kislov v. Am. Airlines, Inc., No. 17 CV 9080, 2022 U.S. Dist. LEXIS 50481 (N.D. Ill. Mar. 22, 2022), dismissed a BIPA class action against airline giant American Airlines arising out of the company’s use of interactive voice response (“IVR”) software in its customer service hotline.  IVR is the “robot voice” that a caller hears when calling a customer support hotline.  Of note, the software also collects, stores, and analyzes callers’ voiceprints to understand and predict callers’ requests and track interactions with callers over time.  According to the plaintiffs, American Airlines deployed this voiceprint technology without providing customers notice or obtaining their consent, in violation of BIPA. The airline moved to dismiss the action, arguing that the suit was preempted by the Airline Deregulation Act (“ADA”).  The court agreed, finding that American Airlines’ use of the IVR software was covered under the ADA’s preemption provision because it concerned the services the airline provided to customers.

Kislov is by no means the first BIPA action to be dismissed based on a successful preemption challenge.  Federal courts have dismissed a number of BIPA class actions on preemption grounds under the Railway Labor Act (“RLA”) and § 301 of the Labor Management Relations Act (“LMRA”).  Significantly, however, both the RLA and the LMRA apply in the context of unionized employment relationships.  Kislov is noteworthy because it demonstrates that the preemption defense is not limited to employers in BIPA litigation, but can also be deployed in a much broader range of contexts, including to defeat biometric privacy class actions filed by customers or other consumers.

State: An Illinois appellate court recently confirmed that federal law may preempt BIPA in certain circumstances.  In Walton v. Roosevelt Univ., 2022 Ill. App. LEXIS 83 (Ill. Ct. App. Feb. 22, 2022), Plaintiff, who belonged to a union, filed suit seeking damages from his employer for alleged BIPA violations, including the collection, storage, use, and dissemination of his biometric data, as well as its disclosure to a third-party payroll service.  Defendant employer moved to dismiss on the grounds that Plaintiff’s claims were preempted because he was covered by a collective bargaining agreement, and the Court agreed.  It found that, while Plaintiff and other members of the putative class who were unionized employees were not prohibited from seeking redress, the collective bargaining agreement required them to seek it through the grievance procedures laid out in the agreement.  Accordingly, the Court expressly held that BIPA “claims asserted by bargaining unit employees covered by a collective bargaining agreement are preempted under federal law.”

On the other hand, in a landmark ruling, McDonald v. Symphony Bronzeville Park, 2022 Ill. LEXIS 194 (Ill. Feb. 3, 2022), the Illinois Supreme Court held that the Illinois Workers’ Compensation Act does not preempt BIPA claims and, as such, an employer may be subject to damages claims under BIPA in addition to liability under the workers’ compensation framework.  Employees of a nursing home had sued, alleging violations of their rights under BIPA based on the employer’s practice of scanning employees’ fingerprints as a means of timekeeping.  The employer argued that the employees could not seek damages under BIPA because the alleged violation had occurred at work, and so plaintiffs’ exclusive remedy was to seek compensation under the Workers’ Compensation Act.  The Illinois Supreme Court disagreed, finding that violations of BIPA are not a “compensable injury” under the state Workers’ Compensation Act because whether an injury is compensable depends on both where the injury occurred and the nature of the injury, and the other compensable injuries under the statute differed greatly from the “personal and societal injuries” under BIPA.

With the first quarter just about wrapped up, 2022 is shaping up to be an exciting year in AI and biometric data litigation.  As the year continues, be sure to stay tuned; CPW will be your go-to source to stay at the forefront of all new developments in real time.

Data privacy is a top-of-mind issue in 2022, and biometric privacy and issues relating to artificial intelligence (AI) have been subject to recent scrutiny from state and federal government officials and legislators. These topics also continue to be areas of focus in the realm of putative privacy class action litigations.

Partners Kristin Bryan and Kyle Fath, as well as senior associate David Oberly, will provide an overview of key developments and trends in this developing area of the law. This will include, among other matters:

  • AI and privacy compliance – An overview of restrictions on and obligations with respect to AI, profiling and other automated decision-making processes under forthcoming privacy laws in California, Virginia, Colorado and Utah.
  • AI and biometrics litigation overview – An overview of the current litigation landscape concerning biometric data and AI, as well as related insights.
  • State legislative priorities – Approaches states are taking to the use of facial recognition technology.
  • Anticipated federal developments – Proposed federal legislation concerning biometrics, AI and other anticipated developments in 2022.

CLE is pending in the following jurisdictions: AZ, CA, NJ, NY, OH and TX.  Registration is available here.

CPW’s David Oberly recently wrote a guest analysis at Law360 concerning Maryland HB 259, the Biometric Identifiers Privacy Act.  As he addresses at Law360, this bill comes in the wake of failed efforts in the state to enact biometric privacy legislation.  HB 259 would require:

  • Publicly available privacy policies containing written retention schedules and guidelines for the permanent destruction of biometric data;
  • Written consent obtained before the time any biometric data is collected;
  • Reasonable security measures to safeguard biometric data; and
  • A prohibition on selling, leasing or otherwise profiting from individuals’ biometric data.

As used in the bill, “biometric identifier” means “the data of an individual generated by automated measurements of an individual’s unique biological characteristics.” This would include, but not be limited to, faceprints, while excluding photographs or video and physical descriptions (e.g., height, weight, hair color, eye color, or tattoo descriptions).

David commented to CPW that “Maryland’s HB 259 was recently followed up by Maine’s legislature, which also introduced its own ‘hybrid’ biometric privacy bill (LD 1459) integrating a mix of concepts traditionally associated with both biometric privacy and consumer privacy statutes.  Thus, a trend is emerging with biometric privacy bills that mirror BIPA in many ways, but which also differ from current biometric privacy statutes in many respects as well.  If enacted, these hybrid statutes will make the task of compliance for companies that utilize biometrics in their operations much more difficult and complex.”

HB 259 also contains a private right of action with liquidated statutory damages.  It provides that a private entity that collects biometric identifiers may not collect, use, disclose, redisclose, or otherwise disseminate an individual’s biometric identifiers unless, among other things, it has the individual’s (or the individual’s representative’s) written consent.  The bill would allow for the recovery of $1,000 or actual damages per violation, whichever is greater, for negligent violations, or $5,000 or actual damages per violation, whichever is greater, for intentional or reckless violations.  A private plaintiff would also be able to recover reasonable attorney’s fees and costs (including expert witness fees and litigation expenses) and obtain other relief, including an injunction.
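To give a rough sense of how those liquidated damages can scale in a class action, below is a minimal, hypothetical Python sketch of the bill’s “greater of” formula as described above.  The $1,000 and $5,000 floors come from the bill’s text as summarized here; the class size, violation counts, and actual-damages figures are invented inputs for illustration only, and the sketch ignores attorney’s fees, costs, and any available defenses.

```python
# Hypothetical illustration of HB 259's liquidated damages formula as described above.
# The $1,000 / $5,000 floors come from the bill; every other input (class size,
# violation counts, actual damages) is an invented assumption for the example.

NEGLIGENT_FLOOR = 1_000      # greater of $1,000 or actual damages, per negligent violation
INTENTIONAL_FLOOR = 5_000    # greater of $5,000 or actual damages, per intentional/reckless violation


def per_claimant_exposure(violations: int,
                          actual_damages_per_violation: float,
                          intentional: bool) -> float:
    """Estimate liquidated damages for one claimant (fees and costs excluded)."""
    floor = INTENTIONAL_FLOOR if intentional else NEGLIGENT_FLOOR
    return violations * max(floor, actual_damages_per_violation)


if __name__ == "__main__":
    # Example: a putative class of 10,000 members, one negligent violation each,
    # with no provable actual damages, still yields $10,000,000 in liquidated
    # damages before attorney's fees, costs, or injunctive relief.
    class_size = 10_000
    exposure = class_size * per_claimant_exposure(
        violations=1, actual_damages_per_violation=0, intentional=False
    )
    print(f"Aggregate statutory exposure: ${exposure:,.0f}")
```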

CPW’s Kristin Bryan, a data privacy and cybersecurity litigator who also advises AI companies on proposed biometrics laws, commented that “HB 259 is one of several biometric privacy laws under consideration this year that would incorporate a private right of action along with significant statutory liquidated damages.  If other states proceed as Illinois did in 2008 in enacting a biometric privacy law, it would be anticipated to trigger a sea change in the data privacy landscape.”

For more on this, stay tuned.  CPW will be there to keep you in the loop.