Texas Enforcement of Biometric Law Focuses on Artificial Intelligence | Holland & Knight LLP

As many organizations turn their attention to compliance with state consumer protection laws taking effect in 2023, they should note that in 2022 the Texas Attorney General (Texas AG) filed two cases involving the collection of biometric data, including data used to develop or train artificial intelligence (AI) models based on machine learning. The consequences of failing to comply with the Texas Capture or Use of Biometric Identifier Act (CUBI) can be significant: CUBI carries civil penalties of up to $25,000 per violation, and the volume of data required for machine learning dramatically multiplies the number of potential violations. This article examines the Texas AG's broad interpretation of CUBI and identifies compliance considerations for organizations that process biometric data in connection with AI.

CUBI background

CUBI governs the collection, receipt, possession, disclosure and retention of biometric identifiers. The statute defines "biometric identifiers" by an exclusive list: "a retinal or iris scan, fingerprint, voiceprint, or record of hand or face geometry."

Under CUBI, organizations are generally prohibited from collecting biometric identifiers for commercial purposes unless they provide prior notice to, and obtain consent from, the data subject. (The term "commercial purpose" is not defined in the statute.) Disclosure of biometric identifiers is restricted, and organizations must protect biometric identifiers with reasonable care and destroy them within a reasonable time – no later than one year after the purpose for collecting them expires. Notably, only the Texas AG can bring an action under CUBI; there is no private right of action.

Broad interpretations of CUBI in the Facebook and Google petitions

Earlier this year, the Texas AG (with the assistance of private law firms) filed a lawsuit against Facebook for alleged violations of CUBI. Piggybacking on a prior civil settlement, the Texas AG alleged that Facebook's photo "tag suggestions" feature collected biometric identifiers without providing disclosure or obtaining consent. Last month, the Texas AG filed a second CUBI lawsuit – this time against Google. The Texas AG alleges that Google's products capture facial geometry from photos and videos and, for Google Assistant, voiceprints from recognized voices, in violation of CUBI.

Facebook and Google reportedly use biometrics not only for user-facing purposes (such as suggestions and groupings) but also to train and improve their facial- and speech-recognition AI models.

CUBI regulates the collection and use of biometric identifiers for "commercial purposes." The Facebook and Google petitions suggest that, in the Texas AG's view, the purpose of improving AI models alone may qualify as a commercial purpose under CUBI. Under that interpretation, almost any collection or use of Texas residents' biometric identifiers in connection with the development of AI models would require CUBI compliance.

The Texas AG also makes clear in its petitions that it considers the sharing of biometric identifiers between affiliates to be a "disclosure" subject to CUBI's restrictions. Given this interpretation, organizations should carefully track which affiliates collect, handle and use biometric identifiers so they can ensure that all such processing is CUBI compliant.

The Texas AG takes the position that CUBI governs the collection and use of biometric identifiers on their own, even absent other identifying information. The petitions do not allege that Facebook or Google collected any other information along with the biometric identifiers, or that either company was able to identify the individuals involved. Indeed, the Texas AG complains about the collection of biometric identifiers even from non-users of Facebook's and Google's products. Organizations therefore face risk even when collecting and using biometric identifiers of unidentified individuals.

The Texas AG has also taken the position that the unauthorized collection and the subsequent storage of biometric identifiers constitute two separate CUBI violations. As the Texas AG stated, "Because Facebook's possession of biometric identifiers was primarily unlawful, maintaining possession of those biometric identifiers for any period of time is improper and violates [CUBI]." As a result, $25,000 per violation can become $50,000 or more for each non-compliant capture of a biometric identifier.

Compliance considerations
The risks under CUBI and other data protection laws increase as more protected data is collected. Organizations should therefore evaluate their collection and handling of biometric identifiers, especially when the data is collected or used to improve AI models. Improving AI models based on machine learning techniques currently requires enormous amounts of data, so non-compliant AI developers could face significant exposure under CUBI and similar laws. It is also conceivable that the Texas AG would seek to extend its reach to companies that work with AI developers to train AI models and could be viewed as indirectly collecting biometric identifiers.

Organizations should consider the following checklist before collecting biometric identifiers for commercial purposes:

  • Provide appropriate notice to data subjects prior to collection
  • Obtain consent (not buried in an agreement) from data subjects prior to capture
  • Do not disclose biometric identifiers (even between affiliates) except in the limited circumstances expressly permitted by law
  • Use reasonable care to protect biometric identifiers
  • Destroy biometric identifiers within a reasonable time, and no later than one year after the purpose for collecting them expires

As noted above, each violation can result in significant penalties. In addition, the Texas AG may seek permanent injunctions for CUBI violations, which could materially impact business functions or activities that depend on the use of collected biometric identifiers.