On May 18, 2023, the Federal Trade Commission (“FTC”) unanimously adopted its Policy Statement on Biometric Information and Section 5 of the Federal Trade Commission Act (“Policy Statement”), addressing the increasing use of consumers’ biometric information and the marketing of technologies that use or claim to use it—practices about which the FTC raises significant concerns in the areas of privacy, data security, and the potential for bias and discrimination. The Policy Statement also provides a detailed discussion of the established legal requirements applicable to the use of biometrics, particularly those arising under Section 5 of the FTC Act, and lists examples of the practices the agency will scrutinize in determining whether companies’ use of biometric technologies runs afoul of Section 5.

Continue Reading FTC’s New Policy Statement on Biometric Information Provides Clear Warning to Companies on Increased Scrutiny of Facial Recognition & Related Biometrics Practices

One of the most notable trends in Illinois Biometric Information Privacy Act (“BIPA”) class action litigation is the marked increase in the number of class actions targeting third-party biometric technology vendors, such as providers of identity authentication systems and employee timekeeping devices. Importantly, because these vendors do not maintain any direct relationship with the end users of their technology, compliance with Illinois’s biometric privacy statute—especially its notice and consent requirements—can be a challenging undertaking. Despite this, to date, the majority of courts have held that BIPA applies to vendors just as it does to employers and other entities that maintain direct relationships with biometric data subjects.

Earlier this month, an Illinois federal court rejected a selfie ID facial recognition identity verification vendor’s bid for dismissal of a BIPA class action in Davis v. Jumio Corp., No. 22 CV 776, 2023 WL 2019048 (N.D. Ill. Feb. 14, 2023). The Davis decision illustrates the scope of exposure faced by vendors for alleged non-compliance with BIPA, as well as the challenges and complexities in obtaining dismissals of biometric privacy class actions prior to the commencement of costly discovery.

Background

Plaintiff maintained a membership with the online cryptocurrency marketplace operated by Binance. Jumio Corporation provides facial recognition identity verification services for its clients, including Binance. Plaintiff sued Jumio, alleging that the company violated BIPA’s Section 15(b) notice and consent requirements when it collected his biometric data during the process of verifying his identity for Binance.

Jumio moved to dismiss the class action pursuant to Federal Rule of Civil Procedure 12(b)(6), raising two arguments in support of dismissal: first, that Plaintiff’s suit was barred by BIPA’s financial institution exemption; and second, that dismissal of the complaint was warranted under Illinois’s extraterritoriality doctrine.

The Decision

The court first considered whether BIPA’s exemption for financial institutions precluded Plaintiff’s claims against Jumio. BIPA Section 25(c) provides that “[n]othing in this Act shall be deemed to apply in any manner to a financial institution or an affiliate of a financial institution subject to Title V of the federal Gramm-Leach-Bliley Act of 1999 [(“GLBA”)] and the rules promulgated thereunder.”

In raising this argument, Jumio did not contend that it was a financial institution itself; rather, Jumio argued that Binance was a financial institution and, as a result, applying BIPA to Jumio in connection with use of the Binance App would effectively result in applying BIPA to Binance, an action that is proscribed by BIPA.

The court disagreed, finding several flaws in Jumio’s argument. First, the court declined to consider materials submitted by Jumio in support of its motion to dismiss, which Jumio had argued allowed the court to take judicial notice of Binance’s qualification as a financial institution for purposes of BIPA’s Section 25(c) exemption. The court held that “Binance’s self-serving statements (such as characterizing itself as a financial institution in other litigation to avoid liability under BIPA) need not be accepted as true and do not support taking judicial notice of the contested fact that Binance is, in fact, a financial institution.” The court also held that the allegations in the complaint were similarly inadequate to demonstrate Binance’s status as a financial institution: other than using the term “cryptocurrency marketplace,” the complaint contained no further factual allegations about Binance’s financial activities.

Second, the court found that even if Binance were a financial institution within the meaning of the GLBA—thus triggering the Section 25(c) exemption—it did not necessarily follow that the claim against Jumio was barred. In so holding, the court rejected Jumio’s argument that, because its software was embedded and integrated into the Binance App, granting Plaintiff’s requested relief under the Illinois biometrics law would apply BIPA to Binance “in any manner” in contravention of Section 25(c). The court explained that even if Jumio were ordered to comply with BIPA’s notice and consent requirements, and as a result had to modify the software it provided to Binance, Binance itself would have no affirmative obligation under BIPA to change the Binance App. Without further information regarding how the Binance App functioned and how Jumio’s software was integrated into it, the court could not determine the extent to which requiring Jumio’s compliance with BIPA would necessitate changes to how Binance did business, such that BIPA could be construed as applying “in any manner” to Binance.

Accordingly, the court declined to dismiss the class action pursuant to BIPA’s financial institution exemption.

The court then turned to Jumio’s argument that Illinois’s extraterritoriality doctrine barred Plaintiff’s lawsuit. In Illinois, a statute is without extraterritorial effect unless a clear intent appears from the express provisions of the statute. Both parties agreed that BIPA did not apply extraterritorially. Therefore, for BIPA to apply to Jumio’s conduct, the circumstances giving rise to the suit must have occurred “primarily and substantially in Illinois.”

Jumio argued that the complaint did not allege that any relevant conduct giving rise to the class action occurred in Illinois, aside from Plaintiff’s allegation that he was an Illinois resident. Notably, after Jumio filed its motion to dismiss, Plaintiff added allegations in his response brief to bolster his opposition to Jumio’s extraterritoriality argument. In its reply, Jumio posited that dismissal was still warranted, as Plaintiff’s new allegations failed to allege that any of Jumio’s conduct took place within the borders of Illinois.

Considering the allegations in the complaint, as supplemented by additional facts in his response brief, the court found that Plaintiff sufficiently alleged a plausible claim that Jumio’s BIPA violations occurred primarily and substantially in Illinois. Specifically, the court found that the following allegations, without more, were enough at the pleading stage to avoid dismissal based on Jumio’s extraterritoriality argument: (1) Plaintiff was an Illinois resident; (2) Jumio conducted business transactions in Illinois; and (3) Plaintiff submitted photographs of his driver’s license and face through the Binance App while in Illinois.

Analysis & Takeaways

Continued Trend of Broad Exposure for Third-Party Biometrics Vendors and Service Providers

Since the start of the year, the Illinois Supreme Court has issued two notable plaintiff-friendly opinions, which resolved the uncertainty surrounding the applicable statute of limitations for BIPA claims and the issue of claim accrual in BIPA litigation, respectively, and in the process significantly expanded the scope of potential liability exposure for BIPA non-compliance. The applicability of BIPA to third-party vendors, however, remains a significant area of ambiguity. To date, the majority of courts to analyze the issue have held that BIPA applies to vendors and service providers even if they do not directly interface with end users. This line of reasoning was most recently affirmed in early February 2023 by an Illinois federal court in Johnson v. NCR Corp., No. 22 CV 3061, 2023 WL 1779774 (N.D. Ill. Feb. 6, 2023) (for more information on the Johnson opinion, you can read Privacy World team member David Oberly’s article analyzing the decision for Biometric Update here).

Davis further illustrates the potential perils that vendors face if they fail to satisfy the full range of BIPA compliance requirements when offering biometrics-related products and services to their commercial clients.

Scope of BIPA’s Financial Institution Exemption Not Unlimited

To date, the Section 25(c) financial institution exemption has been one of the most robust defenses to BIPA class actions, resulting in the dismissal of a number of defendants not traditionally known as “financial institutions,” such as colleges and universities. The Davis decision, however, demonstrates that the contours of the financial institution exemption are not unlimited.

In rejecting the vendor’s assertion of the financial institution exemption as a bar to the BIPA claims asserted against it, the Davis court relied primarily on the lack of sufficient evidence demonstrating that the defendant’s customer was, in fact, a financial institution entitled to seek refuge under BIPA Section 25(c). The Davis court’s reasoning comports with that of other courts that have denied motions to dismiss asserting BIPA’s financial institution exemption as a complete defense to liability, which likewise found inadequate evidence that the defendant or a related entity satisfied the GLBA’s definition of a financial institution such that Section 25(c) would bar the BIPA claims.

Importantly, Davis illustrates that defendants seeking dismissal pursuant to the financial institution exemption must ensure that their motions are supported with evidence sufficient to permit a finding that Section 25(c) applies to the specific activities of the entity at issue—maximizing the likelihood of a favorable outcome on a motion seeking to definitively end class action litigation. This task is especially critical on motions to dismiss, where the scope of evidence the court may consider is curtailed.

Challenges Faced by Defendants in Procuring Dismissals from BIPA Litigation at the Pleading Stage

BIPA class actions have been challenging to defeat at the pleading stage due to a combination of factors, including the deference given to a plaintiff’s allegations on a motion to dismiss, the lack of guidance offered to courts by BIPA’s statutory text, and courts’ willingness to interpret BIPA’s compliance requirements in a manner that heavily favors the plaintiffs’ bar.

Davis is a textbook example of the challenges defendants often face in attempting to obtain dismissal of BIPA disputes before proceeding to the discovery phase of litigation. Of note, although courts are generally limited to the allegations in the complaint on a motion to dismiss, the Davis court considered the Plaintiff’s elaborations on those allegations in his response brief in ruling on the motion. Further, the court found the Plaintiff’s allegations sufficient at the pleading stage to plausibly allege that the claimed BIPA violation occurred primarily and substantially in Illinois, and thus to avoid dismissal on extraterritoriality grounds, even though the Plaintiff alleged only a single fact relating directly to the defendant’s own conduct—that it engaged in business transactions in Illinois. More than that, in rejecting Jumio’s extraterritoriality argument, the court acknowledged that discovery might reveal a connection to Illinois “sufficiently tenuous” to warrant revisiting the matter at summary judgment, but that possibility was not enough to prevent the case from moving past the pleading stage.

To mitigate BIPA litigation risk, all types of entities that use biometric data in their operations should consider taking a conservative approach to compliance—one that ensures all applicable BIPA requirements are satisfied—even where it is not definitively clear that Illinois’s biometrics statute applies to organizational operations.

Specifically, companies should ensure they maintain flexible, comprehensive biometric privacy compliance programs, which should include (among other things) the following:

  • A publicly-available, biometrics-specific privacy policy;
  • Set data retention and destruction guidelines and schedules containing a clear and unambiguous description of the event trigger(s) that will prompt the immediate and permanent destruction of an individual’s biometric data;
  • A mechanism for ensuring written notice is supplied to all data subjects before the time biometric data is collected; and
  • A separate mechanism for ensuring written consent is obtained—permitting the company to collect, possess, retain, store, and disseminate biometric data—before the time any such data is obtained (a minimal illustration of such a consent gate appears after this list).
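For readers on the engineering side, the snippet below is a minimal, hypothetical sketch of the notice-and-consent gate contemplated by the last two items above. Every name in it (ConsentRecord, may_collect_biometrics, enroll_biometric) is illustrative and is not drawn from any statute, case, or vendor API.

```python
# A minimal, hypothetical sketch of a notice-and-consent gate. All names are
# illustrative assumptions, not drawn from any statute or vendor API.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional


@dataclass
class ConsentRecord:
    subject_id: str
    notice_version: str           # which written notice the data subject was shown
    written_release_signed: bool  # written release obtained before any collection
    signed_at: Optional[datetime] = None


def may_collect_biometrics(record: Optional[ConsentRecord]) -> bool:
    """Block collection unless notice was given and a signed written release is on file."""
    return (
        record is not None
        and record.written_release_signed
        and record.signed_at is not None
    )


def enroll_biometric(subject_id: str, record: Optional[ConsentRecord]) -> str:
    if not may_collect_biometrics(record):
        # No notice/consent on file: do not capture or store any biometric identifier.
        raise PermissionError(f"No written release on file for {subject_id}")
    return f"enrollment-started:{subject_id}"


# Example: consent is captured and recorded before any biometric data is collected.
consent = ConsentRecord("user-123", "notice-v2", True, datetime.now(timezone.utc))
print(enroll_biometric("user-123", consent))
```

The design point is simply that the collection path fails closed: no biometric capture occurs unless a signed written release is already on record.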

For more, stay tuned. Privacy World will be there to keep you in the loop.

Last month, Kristin Bryan and Kyle Fath discussed the rapidly evolving realm of biometric data law, offering unique perspectives, from both advisory and litigation standpoints, on the complex privacy challenges and concerns associated with biometrics.

Kristin and Kyle discuss biometric data and the current and forthcoming legal and regulatory landscape, including the Illinois Biometric Information Privacy Act (“BIPA”); summarize litigation and regulatory trends, with a focus on BIPA class action litigation; and explain how to provide practical and actionable advice to your business teams in the development, acquisition, or licensing of biometrics or biometrics-adjacent technology.

If you have questions about biometric data law, contact your SPB relationship partner for further information.

Several months ago, you may have seen social media filled with artistic renditions of your connections as paintings, cartoons, or other artistic styles. These renditions came from Lensa, an app by which users upload “selfies” or other photos, which the app processes to generate artistic images of the user. Lensa, which is owned by Prisma Labs, Inc., is the latest subject of a putative class action brought under the Illinois Biometric Information Privacy Act (“BIPA”).

In Flora, et al., v. Prisma Labs, Inc., No. 5:23-cv-00680 (N.D. Cal.), Plaintiffs—a group that includes a minor child—are residents of Illinois who used the Lensa app to create artistic images of themselves. Plaintiffs allege that they used Lensa in December 2022, after the app exploded in popularity in November 2022 due to the launch of the “magic avatars” feature, which requires users to upload at least eight images of themselves (and up to 20 images) to create artistic, stylized “avatars” of the user’s face. The app can also be used to upload images of others and create avatars based on those images. Plaintiffs allege that Lensa’s privacy policy as of December 2022 did not inform users that their facial geometry would be collected to create the avatars, and that several oblique references to Lensa’s use and processing of users’ images led users to believe that their biometric data is “anonymized” and does not leave the user’s device—which seemingly contradicts Lensa’s model of collecting users’ images and generating avatars based on those images. The Complaint also alleges that Lensa’s privacy policy temporarily disclosed that “face data” would be used to “train” its “neural network algorithms,” but that the provision was subsequently removed and never explained how that data would be protected or disclosed.

Based on the allegations in the Complaint, Plaintiffs seek to represent a class of “All persons who reside in Illinois whose biometric data was collected, captured, purchased, received through trade, or otherwise obtained by Prisma, either through use of the Lensa app or otherwise.” Plaintiffs bring seven causes of action under Sections 15(a), 15(b)(1), 15(b)(2), 15(b)(3), 15(c), 15(d), and 15(e) of BIPA, as well as an additional claim for unjust enrichment based on Lensa’s paid subscription service.

The Complaint also raises additional concerns about Lensa’s business model and methods of generating images. For example, upon downloading the app, a user is prompted to begin a seven-day trial subscription with Lensa; the Complaint alleges that the app uses dark patterns to prompt users to choose this option rather than close out of it and decline the trial subscription. The Complaint also alleges that Lensa uses Stable Diffusion—an open-source AI model trained on over 2 billion images, including images that are protected by copyright—to generate its avatars. As alleged in the Complaint, the system could therefore violate the intellectual property rights of artists who own the copyrights in the images used to train the AI model.

Flora is similar to past BIPA class actions brought against apps that allow users to virtually “try on” makeup, clothing, or other beauty items, as well as class actions brought against entities that use images to “train” AI models. Plaintiffs are represented by Loevy & Loevy, which notably prevailed in the first BIPA case to go to trial, Rogers v. BNSF Railway Company. Privacy World will continue to keep an eye on how this case develops for you.

Today, the Illinois Biometric Information Privacy Act (“BIPA”) remains one of the hottest areas of class action litigation. Despite the high volume of class action filings, however, many significant aspects of Illinois’s biometrics statute remain unsettled and uncertain. One of the most notable open-ended issues pertains to the applicability of BIPA to third-party vendors and service providers, such as the developers and manufacturers of biometrics technologies. To date, the majority of courts to analyze the issue have found that BIPA is squarely applicable to vendors and similar entities that do not directly interface with end users. David Oberly analyzes a recent decision—Johnson v. NCR Corp.—that continues the trend of courts finding in favor of broad BIPA liability exposure for third-party vendors, as well as the implications of the opinion, in this Biometric Update article: Lessons Learned From Recent BIPA Third-Party Vendor Decision.

New Year, New Bills

As Privacy World reported, 2022 saw a plethora of class action litigation stemming from alleged non-compliance with the well-known Illinois Biometric Information Privacy Act (“BIPA”). At the same time, due to concerns about whether companies are using biometrics in a safe and responsible manner, lawmakers from coast to coast also attempted (albeit unsuccessfully) to put in place their own regulations governing biometrics during the 2022 legislative cycle. Predictably, much of the same has taken place at the outset of 2023, with a total of nine states having already introduced biometrics-focused legislative proposals in January alone.

Below, we break down the bills that have been introduced in 2023 and what they would mean for companies if enacted.

Background: What Does BIPA Do?

As a refresher: under BIPA, which was enacted in 2008 and was the first state biometric privacy law in the U.S., companies that collect or possess biometric data must adhere to a range of core compliance obligations:

  • Under Section 15(a), companies must maintain a publicly-available privacy policy which includes, at a minimum, the company’s schedule and guidelines for permanently destroying biometric data.
  • Under Section 15(b), companies must provide notice and obtain consent before collecting biometric data.
  • Under Section 15(c), companies must refrain from leasing, trading, selling, or otherwise profiting from biometric data.
  • Under Section 15(d), companies must refrain from disclosing biometric data unless consent is first obtained for the disclosure or, alternatively, if one of three exemptions applies to the disclosure.
  • Finally, under Section 15(e) companies must maintain security measures designed to safeguard biometric data.

Individuals “aggrieved” by a violation of BIPA may pursue class action litigation for non-compliance and are entitled to recover $1,000 for each negligent violation of the law and $5,000 for each intentional or reckless violation, along with attorneys’ fees.
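To put these statutory figures in perspective, the arithmetic below is a minimal, hypothetical sketch; the class size and per-person violation counts are illustrative assumptions rather than figures from any case discussed in this post. It also previews why the question of claim accrual—whether a violation accrues once per person or once per scan—has such dramatic consequences for exposure.

```python
# Hypothetical illustration of how BIPA statutory damages scale. Class size and
# per-person violation counts are assumptions for illustration only.

NEGLIGENT = 1_000   # statutory damages per negligent violation
RECKLESS = 5_000    # statutory damages per intentional or reckless violation


def exposure(class_size: int, violations_per_person: int, per_violation: int) -> int:
    """Rough statutory-damages exposure, before attorneys' fees and costs."""
    return class_size * violations_per_person * per_violation


# Per-person accrual theory: one violation for each of 10,000 class members.
print(exposure(10_000, 1, NEGLIGENT))    # 10,000,000

# Per-scan accrual theory: e.g., 250 timekeeping scans for each class member.
print(exposure(10_000, 250, NEGLIGENT))  # 2,500,000,000
```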

Breakdown of 2023 Biometric Privacy Legislative Proposals

Notably, the bills introduced at the outset of 2023 are all closely patterned after BIPA—imposing similar compliance obligations and allowing individuals to pursue class action litigation for mere technical non-compliance. With that said, a number of the 2023 biometrics bills also contain unique compliance requirements that are not found in any state or municipal biometrics statute currently in effect. The end result is that companies may see expanded exposure similar to that under BIPA, as well as the need to make significant changes to their existing biometric privacy compliance programs.

Arizona “Act Relating to Biometric Information”

On January 30, Arizona lawmakers introduced the “Act Relating to Biometric Information” (SB 1238). SB 1238 is a carbon copy of BIPA—imposing identical compliance obligations and utilizing a private right of action as the bill’s exclusive enforcement mechanism, which allows anyone “aggrieved” by a violation of the law to pursue class action litigation and recover statutory damages of $1,000 for each negligent violation and $5,000 for each intentional or reckless violation.

 Hawaii Biometric Information Privacy Act

 On January 20, Hawaii lawmakers introduced the Biometric Information Privacy Act (SB 1085). SB 1085 parallels BIPA’s compliance obligations almost completely, with one major exception. Specifically, the Hawaii bill provides a single, fairly narrow exemption from its data retention and destruction obligations, allowing companies to retain biometric data for a longer period of time than is prescribed by law where the retention of such data is required for legal compliance purposes. Also like BIPA, the Hawaii bill utilizes a private right of action as its exclusive enforcement mechanism, allowing for the recovery of $1,000 to $5,000 per violation of the law.

If enacted, the Hawaii BIPA would take effect immediately upon its approval—which could create significant compliance challenges for companies that utilize biometric data in their commercial operations, especially if they do not have any type of biometrics compliance program in place at this time.

Maryland Biometric Data Privacy Act

On January 11, the Maryland House of Delegates introduced the Maryland Biometric Data Privacy Act (“BDPA”) (HB 33). A week later, the Maryland Senate introduced an identical bill (SB 169). Of note, in 2022 the Maryland House passed an identical biometrics bill (HB 259), but that legislation ultimately failed to garner enough support in the state Senate to become law.

Importantly, the BDPA not only incorporates many of BIPA’s core compliance obligations—such as informed consent—but also includes a number of additional provisions that have traditionally been seen only in connection with broader consumer privacy statutes. As just one example, the BDPA provides data subjects with a “right to know,” which would compel companies to disclose, upon request, a range of information regarding their collection and use of biometric data. In addition, the BDPA’s data retention and destruction requirements mandate that covered businesses destroy biometric data within 30 days after receiving a data subject’s request for deletion—in essence, creating a consumer “right to delete” that companies falling under the scope of the legislation must honor.

The other main distinction between the Maryland bill and Illinois’s BIPA pertains to their respective enforcement provisions. Unlike BIPA, which provides a private right of action as its exclusive enforcement mechanism, the Maryland bill not only includes a private right of action, but also affords the state’s attorney general the authority to impose civil penalties of up to $10,000 per violation.

If enacted, the BDPA would go into effect on October 1, 2023—providing only minimal time for companies to build out or otherwise modify their compliance programs to achieve compliance with the BDPA.

 Massachusetts Biometric Information Privacy Act

On January 20, Massachusetts lawmakers in both the House and Senate filed similar biometric privacy bills—referred to as the Massachusetts Biometric Information Privacy Act (HD 3053 and SD 2218). These two bills are similar to BIPA, but both depart from the Illinois law in several key respects.

Specifically, compared to BIPA, HD 3053:

  • Provides more detailed, granular privacy policy disclosure requirements, as well as a requirement that covered businesses provide data subjects with notice of any privacy policy change at least 20 days before the change goes into effect;
  • Includes a unique prohibition on the use of biometric data for “monetization” purposes; and
  • In addition to providing a private right of action allowing for class litigation, the bill authorizes the state attorney general to pursue civil penalties for violations of the Massachusetts law.

Similarly, compared to BIPA, SD 2218:

  • Introduces a unique compliance obligation that bars “commercial establishments”—defined as a “place of entertainment, retail store, or food and drink establishment”—from using any biometric data for identification (surveillance) purposes;
  • Allows the state AG to impose civil penalties for violations of the law; and
  • Provides higher damages awards in class action litigation—specifically, “no less” than $5,000 per violation (regardless of whether the violation was negligent or intentional/reckless)—as well as a damages multiplier of two to three times the original statutory damages award if the court finds that the violation was willful or knowing (see the illustrative arithmetic after this list).
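The short sketch below illustrates, with hypothetical numbers, how SD 2218’s damages structure could compound; the violation count is an assumption, and whether the two-to-three-times multiplier would replace or be added to the base award would turn on the bill’s final text.

```python
# Hypothetical illustration of SD 2218's damages structure as described above.
# The violation count is an assumption; the bill's final text would control
# whether the multiplier replaces or is added to the base award.

BASE_PER_VIOLATION = 5_000  # "no less" than $5,000 per violation


def sd2218_award(violations: int, willful_or_knowing: bool, multiplier: int = 3) -> int:
    """Base statutory damages, enhanced 2x-3x for willful or knowing violations."""
    base = violations * BASE_PER_VIOLATION
    return base * multiplier if willful_or_knowing else base


print(sd2218_award(1_000, willful_or_knowing=False))  # 5,000,000
print(sd2218_award(1_000, willful_or_knowing=True))   # 15,000,000 with a 3x multiplier
```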

Minnesota “Act Relating to Private Data and Establishing Standards for Biometric Privacy”

 On January 30, Minnesota lawmakers introduced the Minnesota Biometric Privacy Act (SF 954). SF 954 is also similar to BIPA—containing identical compliance requirements and available remedies for non-compliance with the law.

Mississippi Biometric Identifiers Privacy Act

 On January 12, Mississippi lawmakers introduced the Biometric Identifiers Privacy Act (HB 467). The Mississippi BIPA was very similar to the bills currently pending in the Maryland House and Senate (HB 33 and SB 169), in that the Mississippi legislation contained a number of consumer rights ordinarily confined to broader consumer privacy statutes. With that said, the Mississippi BIPA died in committee on January 31, eliminating the prospect of new biometrics regulation in the Magnolia State—at least for 2023.

 New York Biometric Privacy Act

On January 17, 2023, New York lawmakers introduced the New York Biometric Privacy Act (AB 1362). The Empire State is no stranger to proposed biometrics legislation, having introduced identical bills during the two previous legislative cycles. The New York BPA also resembles Illinois’s BIPA—providing identical compliance obligations and the recovery of statutory damages ranging from $1,000 to $5,000 per violation in class action litigation. If enacted, the BPA would take effect 90 days after having become law.

 New York “Act Prohibiting Use of Facial Recognition System by Landlords on Residential Premises”

In addition, on January 4 New York lawmakers introduced a unique piece of legislation that would prohibit the use of facial recognition technology by landlords on any residential premises in the state (AB 322). As many know, New York City recently enacted its Tenant Data Privacy Act (“TDPA”), which imposes a range of requirements and restrictions on the use of all types of biometrics by owners and landlords in apartment complexes and similar types of residential housing. With AB 322, New York has gone a step further by attempting to impose a blanket ban on facial biometrics use by Empire State landlords and property owners.

AB 322 defines facial recognition, for purposes of the prohibition, as the automated or semi-automated process by which a person is identified or attempted to be identified based on the characteristics of their face, including identification of known or unknown individuals or groups. The bill defines “face recognition system” as “any computer software or application that performs facial recognition.”

Under AB 322, landlords would be prohibited from obtaining, retaining, accessing, or using—on any residential premises—(1) any facial recognition system, or (2) any information obtained from, or by use of, a facial recognition system. AB 322 provides both for AG enforcement of civil penalties for non-compliance with the law and for a private right of action allowing data subjects to pursue $1,000 in statutory damages for each violation of the legislation through class action litigation.

New York “Act Prohibiting Private Entities From Using Biometric Data for Advertising”

Lastly, on January 20 New York lawmakers introduced a second unique piece of legislation, this time targeting the use of facial recognition for advertising and marketing purposes (S 2390). The bill seeks to ban private companies from using biometric data for any advertising, detailing, marketing, promotion, or other related activities intended to influence sales, as well as for any evaluation of the effectiveness of marketing practices. Absent from S 2390 is any language providing an enforcement mechanism for violations of the law. If enacted, the bill would take effect 30 days after it becomes law.

 Tennessee Consumer Biometric Data Protection Act

 On January 23, Tennessee lawmakers introduced the Tennessee Consumer Biometric Data Protection Act (SB 339). SB 339 is nearly identical to BIPA in terms of its compliance obligations and enforcement mechanism.

SB 339 diverges from BIPA’s statutory text by including detailed language focused on ascertaining the number of violations committed by a private entity. This language was likely included to avoid the uncertainty over “claim accrual” that has caused significant complexities and challenges for defendants in BIPA class action litigation. The Illinois Supreme Court is set to provide a definitive resolution on the issue of claim accrual in BIPA litigation when it renders its opinion in Cothron v. White Castle Sys., No. 128004, which is currently pending before the Court.

For additional information on the Cothron Illinois Supreme Court appeal, see our extensive Privacy World coverage here, here, here, and here.

 If enacted, the Tennessee biometrics law would take effect on January 1, 2024.

 Vermont “Act Relating to Protection of Personal Information”

On January 26, Vermont legislators introduced “An Act Relating to Protection of Personal Information” (H 121), which departs significantly from BIPA, including with regard to:

  • Inclusion of detailed content criteria for providing individualized notice prior to the collection of biometric data;
  • More flexibility in obtaining consent from data subjects, including through verbal assent or in any other way that is reasonably calculated to collect informed, confirmable consent; and
  • An obligation to implement a mechanism to prevent the subsequent use of biometric data before any such data is collected or retained.

Moreover, unlike BIPA, H 121 offers both class action litigation and AG enforcement of civil penalties as enforcement methods for non-compliance with the Vermont biometrics law. If enacted, the Vermont legislation would take effect on July 1, 2023.

Mitigating Biometric Privacy Risk Going Forward

 Monitor Closely for Additional Legislative Developments

As we noted earlier this year, as businesses across all industries increase their reliance on biometric data to improve the efficiency of their operations and satisfy consumers’ growing interest in this next-generation technology, lawmakers are also greatly increasing their efforts to enact tighter regulations over the collection and use of biometric data.  As this area of regulation continues to develop, be sure to stick with Privacy World: we’ve got you covered.

In addition, readers are strongly encouraged to join SPB’s Kyle Fath and Kristin Bryan for a timely webinar on the evolving landscape of laws around biometric data. The program will offer an engaging discussion, including the advisory and litigation perspectives relating to privacy in the specific context of biometrics. Importantly, during the webinar Kyle and Kristin will provide a deep dive into many of the biometric privacy bills discussed in this post, as well as strategies for how companies can get ahead of the compliance curve by implementing proactive modifications to their biometrics compliance programs that take into account the common compliance components and themes of the biometric privacy legislation introduced to date in 2023.

 For additional information and to register for the webinar, click here: The Expanding Landscape of Biometric Data Law: Where We Are and What’s to Come

Join SPB’s Kyle Fath and Kristin Bryan for a timely webinar on the evolving landscape of laws around biometric data.  The program will offer an engaging discussion including the advisory and litigation perspectives related to privacy in the context of biometrics.

Key areas of focus will include:

  • Biometric data and the current and forthcoming legal and regulatory landscape, including the Illinois Biometric Information Privacy Act (BIPA)
  • A summary of litigation and regulatory trends, with a focus on BIPA class action litigation
  • How to provide practical and actionable advice to your business teams in the development, acquisition or licensing of biometrics or biometrics-adjacent technology

For additional information and to register, click here.

While Madison Square Garden might normally make headlines for musical artists or sporting events, the venue’s parent company, MSG Entertainment, has been in the spotlight following media and regulator attention regarding its use of facial recognition technology to ban certain individuals from its venues. Read on to learn more about the controversy and its implications for other uses of facial recognition technology.

First, some background. MSG Entertainment’s use of biometric facial recognition came under scrutiny last December, when an attorney employed by a law firm engaged in litigation against MSG Entertainment was denied entry to the Radio City Christmas Spectacular, which she was attending with her child. She was stopped by the venue’s security staff, who knew her name and the firm she was associated with, and who purportedly informed her that she had been identified by the venue’s facial recognition system as part of an “attorney exclusion list.”

This was not the only instance in which an attorney was seemingly denied entry based solely on the fact that the attorney, or the attorney’s firm, is engaged in litigation against MSG. Based on several news reports, the company has a policy of excluding from its venues not only attorneys representing parties engaged in litigation against MSG Entertainment, but also all attorneys employed by the firms handling those litigations, and uses software to identify those attorneys from their photos on the firms’ websites. For example, a Long Island attorney was banned from MSG before a Knicks-Celtics game after her law firm filed a suit on behalf of a fan who fell from a skybox at MSG during a Billy Joel concert, and another attorney was stopped from entering an MSG venue for a Rangers game because the attorney was employed by a firm suing MSG.

At least two law firms filed suit against MSG Entertainment in December 2022 over the ban. Although these suits did not raise biometric or AI-based claims, they alleged violations of New York state civil rights laws and prima facie tort claims and requested declaratory judgment in addition to a temporary restraining order, preliminary injunction, and permanent injunction. The ban has been met with criticism from the judges presiding over these actions, including Chancellor Kathaleen McCormick of the Delaware Court of Chancery, who remarked that MSG Entertainment’s letter reinforcing the ban was “the stupidest thing [she’d] ever read.”

The debate over MSG Entertainment’s facial recognition software illustrates the divide between consumer perception of using facial recognition for authentication or verification purposes, which has generally become more accepted, versus using such technology for real-time surveillance or identification outside of the context of express consumer consent.

This shifting public perception of the various purposes for which facial recognition may be utilized is also congruent with recent legislative activity.  For example, in response to the recent events at MSG Entertainment’s venues, a bill was introduced in the New York state legislature to add “sporting events” to the list of public places of entertainment that are barred from refusing entry to individuals with a valid ticket.  New York State Senator Brad Hoylman-Sigal condemned MSG Entertainment’s policy, stating, “MSG claims they deploy biometric technology for the benefit of public safety when they remove sports fans from the Garden. This is absurd given that in at least four reported cases, the patrons who were booted from their venues posed no security threat and instead were lawyers at firms representing clients in litigation with MSG.”

Although the bill does not specifically address the use of facial recognition technology, it would nonetheless work to limit the ways in which such technology is used.  Similarly, New York Attorney General Letitia James penned a letter to MSG Entertainment warning that the ban could violate anti-discrimination laws and could chill attorneys from taking on certain types of litigation against the company.

Biometric technology has been a focus of state regulation for some time, most significantly with Illinois’ Biometric Information Privacy Act (“BIPA”); Texas’ Capture or Use of Biometric Identifier Act (“CUBI”); and Washington’s HB 1493. While BIPA is considered the most stringent of the three state statutes, each imposes certain requirements relating to notice, consent, and data security measures for biometric information or identifiers. BIPA also contains a private right of action permitting for the recovery of statutory damages, which has made it a frequent target for class action litigation. New biometric privacy bills have also recently been introduced in New York, Hawaii, Mississippi, and Maryland, which would similarly regulate the collection and use of all forms of biometric data.

Lawmakers have also enacted legislation at a local level to govern the use of facial recognition technology and, more specifically, to thwart potential improper uses of the technology. In late 2020, Portland, Oregon became the first U.S. jurisdiction to ban the use of facial recognition by the private sector, clarifying in the prefatory materials for the ordinance that lawmakers were primarily concerned with the use of facial recognition for surveillance purposes within physical spaces and its corresponding potential risks for misidentification and misuse. New York City has already enacted a municipal-level ordinance regulating the use of biometrics-powered technologies by “commercial establishments.”

As a result of certain high-profile incidents, including those discussed above related to MSG Entertainment, more states may be inclined to enact biometric privacy bills modeled after BIPA (or to take a more tailored approach that still provides certain protections addressing biometric privacy concerns). Simultaneously, these developments may encourage lawmakers in other jurisdictions who are contemplating regulating the use of this technology—but who have not yet introduced legislation and who lack an appetite for passing an outright ban—to push forward with additional biometric regulations.

MSG Entertainment is due to respond to Attorney General Letitia James’s Letter by February 13, 2023 “to state the justifications for the Company’s Policy and identify all efforts you are undertaking to ensure compliance with all applicable laws and that the Company’s use of facial recognition technology will not lead to discrimination.” For updates on MSG Entertainment’s response and other developments relating to facial recognition software in New York, Privacy World will be there to keep you in the loop.

2022 was another year of high activity and significant developments in the realm of artificial intelligence (“AI”) and biometric privacy related matters, including in regard to issues arising under the Illinois Biometric Information Privacy Act (“BIPA”) and others.  This continues to be one of the most frequently litigated areas of privacy law, with several notable rulings and emerging patterns of new activity by the plaintiffs’ bar.  Following up on Privacy World’s Q2 and Q3 2022 Artificial Intelligence & Biometric Privacy Quarterly Newsletters, be sure to read on for a recap of key developments and insight as to where 2023 may be headed.

Continue Reading Privacy World 2022 Year in Review: Biometrics and AI

Welcome to the 2022 Q3 edition of the Artificial Intelligence & Biometric Privacy Report, your go-to source for keeping you in the know on all recent major artificial intelligence (“AI”) and biometric privacy developments that have taken place over the course of the last three months. We invite you to share this resource with your colleagues and visit Squire Patton Boggs’ Data Privacy, Cybersecurity & Digital Assets and Privacy & Data Breach Litigation homepages for more information about our capabilities and team.

Also, we are extremely pleased to announce that our own Kristin Bryan was named as a 2022 Law360 Cybersecurity & Privacy MVP. As Law360 notes, “[t]he attorneys chosen as Law360’s 2022 MVPs have distinguished themselves from their peers by securing hard-earned successes in high-stakes litigation, complex global matters and record-breaking deals.” You can read more about Kristin’s Law360 award here: Law360 MVP Awards Go to 188 Attorneys From 78 Firms.

Continue Reading 2022 Q3 Artificial Intelligence & Biometric Privacy Report