By Jonathan Maddalone, of Baker & Hostetler LLP
With the United States preparing to host the 2026 FIFA World Cup, the 2028 Summer Olympics in Los Angeles and the 2034 Winter Olympics in Salt Lake City, venues, individual states and the federal government are racing to balance the heightened security risks posed by exceptionally large crowds against the Fourth Amendment right to be free from unreasonable searches, the requirements of the U.S. Foreign Intelligence Surveillance Act, and compliance with a variety of state data privacy and biometric security laws.
A widely covered feature of the 2024 Paris Summer Olympics was the French government’s contracts with security companies to use AI-powered surveillance to provide real-time video analysis to detect potential threats in public spaces. This AI-powered surveillance uses algorithms to identify predetermined “events” and then sends alerts to human operators who decide whether the alert is genuine and whether to act. These events can include crowds surging toward a gate, a person leaving a backpack on a street corner, certain traffic violations, and smoke or flames. Privacy advocates argue that the surveillance captures, collects and analyzes physiological features and behaviors of individuals, including their movements and gestures, and that this practice is problematic to the extent such information could be considered personally identifiable information (PII).
The extent to which privacy laws are implicated will depend on how the AI-powered system functions, its purpose and the type of information it collects. Before implementing an AI-powered security system, it is important to consider how much and what type of data will be collected and analyzed to identify these events; what happens to and who has access to the data after it is collected; and how the AI-powered system addresses training data, error rates and evidence of bias.
A hotly debated topic concerning AI-powered systems is the use of facial recognition technology to enhance physical security. Navigating the compliance requirements under applicable biometric privacy laws and state comprehensive consumer privacy laws will be critical. For example, states like Illinois, Texas and Washington have implemented biometric privacy laws that may limit a party’s ability to capture, collect or disclose biometric identifiers and/or biometric information without informed consent unless an exemption applies. At the same time, states across the U.S. have passed comprehensive consumer privacy laws that consider biometric information to be a form of “sensitive” information. Some of these states, including California and Utah, are set to host major sporting events such as FIFA World Cup matches and the Olympics, raising nuanced privacy considerations for the state and local governments as well as the venues.
Understanding the exceptions to these laws and their applicability is also crucial. For example, the CCPA regulations permit service providers to retain, use or disclose personal information in order to protect against fraudulent or illegal activity. Cal. Code Regs. tit. 11, § 999.314. Meanwhile, Texas permits the disclosure of biometric identifiers if the disclosure is made by or to a law enforcement agency for a law enforcement purpose in response to a warrant. Tex. Bus. & Com. Code Ann. § 503.001(c)(1)(D).
The potential for AI-powered surveillance is seemingly endless; it can be used during concerts and sporting events as well as by cities in metro and train stations during heavy use periods. Because the question of whether PII is present in AI models is hotly debated, we expect privacy lawsuits involving these technologies as the law and technology continue to evolve.