Whether it is facial recognition technology being used in connection with COVID-19 screening tools and in law enforcement, continued use of fingerprint-based time management systems, or the use of various biometric identifiers for physical security and access management, applications involving biometric identifiers and information in the public and private sectors continue to grow. Concerns about the privacy and security of that information continue to grow as well. Several states have laws protecting biometric information in one form or another, chief among them Illinois, but the desire for federal legislation remains.

Modeled after Illinois’s Biometric Information Privacy Act (BIPA), the National Biometric Information Privacy Act (Act), proposed by Sens. Jeff Merkley and Bernie Sanders, contains three key provisions:

  • A requirement to obtain consent from individuals prior to collecting and disclosing their biometric identifiers and information.
  • A private right of action against entities covered by the Act that violate its protections, which entitles aggrieved individuals to recover, among other things, the greater of (i) $1,000 in liquidated damages or (ii) actual damages for negligent violations of the protections granted under the law.
  • An obligation to safeguard biometric identifiers and biometric information in a manner similar to how the organization safeguards other confidential and sensitive information, such as Social Security numbers.

The Act would apply to “private entities,” generally including a business of any size in possession of biometric identifiers or biometric information of any individual. Federal, state, and local government agencies and academic institutions are excluded from the Act.

Under the Act, private entities would be required to:

  • Develop and make available to the public a written policy establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information. That schedule may not extend more than one year beyond an individual’s last interaction with the entity, and destruction could be required earlier;
  • Collect biometric identifiers or biometric information only when needed to provide a service to the individual or for another valid business reason;
  • Inform individuals that their biometric identifiers or biometric information are being collected or stored, along with the purpose and length of the collection, storage, or use, and obtain a written release from the individuals that may not be combined with other consents, including an employment agreement;
  • Obtain a written release immediately prior to the disclosure of any biometric identifier or biometric information that includes the data to be disclosed, the reason for the disclosure, and the recipients of the data; and
  • Maintain the information using a reasonable standard of care.

Readers familiar with Illinois’s BIPA will recognize these requirements. Readers familiar with the California Consumer Privacy Act (CCPA) will likewise recognize the Act’s “Right to Know.” The Act would grant individuals the right to request certain information about biometric identifiers or biometric information collected by a covered entity within the preceding 12-month period. This information includes “specific pieces of personal information” and “the categories of third parties with whom the business shares the personal information.” The Act uses the term “personal information” but does not define it, leaving it unclear whether the right extends only to biometric identifiers and biometric information.

Most troubling is the private right of action provision referenced above. The Act uses language similar to that in the BIPA, which has led to a flood of class action litigation, including a decision by the Illinois Supreme Court finding that plaintiffs need not show actual harm to recover under the law. The legislative process likely will result in some modification to the bill, assuming it survives at all, a fate few privacy bills have enjoyed at the federal level. Nonetheless, we will continue to monitor the progress of this and similar laws.

Pending legislation could create new consumer privacy rights in Massachusetts. Earlier this year, Senator Cynthia Creem presented An Act Relative to Consumer Data Privacy in the Massachusetts Senate. This Consumer Privacy Bill, SD.341, combines key aspects of the California Consumer Privacy Act (CCPA) and Illinois’s Biometric Information Privacy Act (BIPA). The bill would grant Massachusetts consumers a private right of action if their personal information or biometric information (referred to separately in the bill) is improperly collected.

The Consumer Privacy Bill defines “biometric information” as an individual’s physiological, biological or behavioral characteristics, including an individual’s DNA, that can be used, singly or in combination with each other or with other identifying data, to establish individual identity. Biometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted, and keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, or exercise data that contain identifying information.

The bill defines “personal information” as any information relating to an identified or identifiable consumer. “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or the consumer’s device.

However, this definition does not include publicly available information, deidentified consumer information, or aggregate consumer information. Moreover, the bill creates an exception for a business collecting or disclosing personal information of its employees, so long as the business does so within the scope of its role as an employer. Therefore, unlike the CCPA, whose application to employee data remains an open question, the current text of the Massachusetts bill makes clear that the law would not apply to employee data as defined above. That said, it is still early in the legislative process, and the bill could be revised to include employee data.

The pending legislation would require businesses collecting a Massachusetts consumer’s personal information to notify the consumer of the following rights before the point of collection:

(1) The categories of personal information it will collect about that consumer;

(2) The business purposes for which the categories of personal information shall be used;

(3) The categories of third parties with whom the business discloses personal information;

(4) The business purpose for third party disclosure; and

(5) The consumer’s rights to request:

                  (A) A copy of the consumer’s personal information;

                  (B) The deletion of the consumer’s personal information; and

                  (C) To opt out of third party disclosure.

In addition to this notice requirement, the bill would give consumers a statutory right to request that businesses collecting their personal information disclose to the consumer:

(1) The specific pieces of personal information the business has collected about that consumer;

(2) The sources from which the consumer’s personal information was collected;

(3) The names of third parties to whom the business disclosed the consumer’s personal information; and

(4) The business purpose for third party disclosure.

Businesses would have to make available to consumers two or more designated methods for submitting consumer verified requests for personal information, including, if the business maintains a web site, a link on the home page of the web site. A business receiving a verifiable consumer request generally must provide the requested information within 45 days of receiving the request, but may extend that period once by an additional 45 days, so long as the request for the extension is provided within the first 45-day period. The proposed legislation also creates a consumer right to request that a business delete any personal information collected from the consumer, and the right to opt out of third party disclosure at any time.

The legislation would be enforceable both through a private right of action and by the Massachusetts Attorney General. A consumer could recover (1) damages in an amount not greater than $750 per consumer per incident or actual damages, whichever is greater, for any violation of the act; (2) injunctive or declaratory relief; and (3) reasonable attorney fees and costs. The Attorney General would be authorized to obtain a temporary restraining order or preliminary or permanent injunction against a violation of the Act. In addition, the Attorney General could seek a civil penalty of not more than $2,500 for each violation or $7,500 for each intentional violation.

This Consumer Privacy Bill would impose administrative burdens on businesses, including an obligation to train employees, and would create new exposure to damages and penalties. Given the litigation we are seeing under BIPA, businesses collecting Massachusetts consumers’ personal information should monitor the progress of this legislation to determine whether they should begin preparing to comply with yet another consumer privacy law.


In 2018, Delta paved the way in airport terminal development by introducing the first biometric terminal at Hartsfield-Jackson Atlanta International Airport, where passengers can use facial recognition technology from curb to gate. Delta now allows members of its Sky Club airport lounges to enter using fingerprints rather than a membership card or boarding pass. Other airlines use biometric data to verify travelers during the boarding process with a photo capture. The photograph is then matched, through biometric facial recognition technology, to photos previously taken of the passengers for their passports, visas, or other government documentation.

Though the use of a fingerprint or facial scan aims to streamline and expedite the travel process and strengthen the security of air travel, it also presents heightened security risks for biometric data on a larger scale. As the use of biometric data increases, so does the potential impact of a data breach. While it is possible to change a financial account number, a driver’s license number, or even a Social Security number, you cannot easily change your fingerprint or your face. Furthermore, in the past, facial recognition software has not always accurately identified people of color, raising concerns that individuals may be racially profiled.

Yet many argue that biometric-based technologies can help solve vexing security and logistics challenges in travel. For example, in 2016, Congress authorized up to $1 billion collected from certain visa fees to fund implementation of biometric-based exit technology. That was followed by President Trump’s executive order, signed in March 2017, directing the Department of Homeland Security to expedite implementation of a biometric entry-exit tracking system for all travelers to the United States. As it stands, we are likely to see a rapid expansion of biometric technology used by airlines and other businesses in the travel industry, so prepare your picture-perfect travel face!

Notably, the use of biometric data is growing across all industries and in a variety of applications – e.g., premises security, time management, systems access management. But so is the number of state laws intended to protect that data. States such as Illinois, Texas, and Washington are leading the way, with others sure to follow. Requirements include notice and consent, mandates to safeguard biometric information, and obligations to notify individuals in the event biometric information is breached. And litigation is increasing. The Illinois Supreme Court recently handed down a significant decision, for example, concerning the ability of individuals to bring suit under the Illinois Biometric Information Privacy Act (BIPA). In short, individuals need not allege actual injury or adverse effect beyond a violation of their rights under BIPA. The decision is likely to increase the already significant number of suits, including putative class actions, filed under the BIPA.

Companies, regardless of industry, should be reevaluating their biometric use practices, and taking steps to comply with a growing body of law surrounding this sensitive information.

Earlier today, the Illinois Supreme Court handed down a significant decision concerning the ability of individuals to bring suit under the Illinois Biometric Information Privacy Act (BIPA). In short, individuals need not allege actual injury or adverse effect, beyond a violation of his/her rights under BIPA, in order to qualify as an “aggrieved” person and be entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the Act.  Potential damages are substantial as the BIPA provides for statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation of the Act.  To date, no Illinois court has interpreted the meaning of “per violation,” but the majority of BIPA suits have been brought as class actions seeking statutory damages on behalf of each individual affected.

If they have not already done so, companies should immediately take steps to comply with the statute. That is, they should review their time management, point of purchase, physical security, or other systems that obtain, use, or disclose biometric information (any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry used to identify an individual) against the requirements under the BIPA. In the event they find technical or procedural gaps in compliance – such as not providing written notice, obtaining a release from the subject of the biometric information, obtaining consent to provide biometric information to a third party, or maintaining a policy and guidelines for the retention and destruction of biometric information – they need to quickly remedy those gaps.  For additional information on complying with the BIPA, please see our BIPA FAQs.

Companies were hoping that the Illinois Supreme Court would ultimately conclude, consistent with the underlying appellate decision, that in order for a plaintiff to bring a claim under the BIPA (i.e. in order for the plaintiff to be considered “aggrieved”) the plaintiff would have to allege actual harm or injury, and not just a procedural or technical violation of the statute.  In reversing and remanding the case, the Illinois Supreme Court held:

The duties imposed on private entities by section 15 of the Act (740 ILCS 14/15) regarding the collection, retention, disclosure, and destruction of a person’s or customer’s biometric identifiers or biometric information define the contours of that statutory right. Accordingly, when a private entity fails to comply with one of section 15’s requirements, that violation constitutes an invasion, impairment, or denial of the statutory rights of any person or customer whose biometric identifier or biometric information is subject to the breach. Consistent with the authority cited above, such a person or customer would clearly be “aggrieved” within the meaning of section 20 of the Act (740 ILCS 14/20) and entitled to seek recovery under that provision. No additional consequences need be pleaded or proved. The violation, in itself, is sufficient to support the individual’s or customer’s statutory cause of action.

The decision is likely to increase the already significant number of suits, including putative class actions, filed under the BIPA.  In the words of the Illinois Supreme Court, “[c]ompliance should not be difficult; whatever expenses a business might incur to meet the law’s requirements are likely to be insignificant compared to the substantial and irreversible harm that could result if biometric identifiers and information are not properly safeguarded; and the public welfare, security, and safety will be advanced.”

An Illinois nursing home is facing a putative class action lawsuit filed by a worker who argues that the facility’s required fingerprint scan for timekeeping poses a threat to employee privacy and violates Illinois’s Biometric Information Privacy Act (“BIPA”). From July 2017 to October 2017, at least 26 employment class actions based on the BIPA were filed in Illinois state court, and the filings show no sign of slowing.

Although some consider Illinois the leader in biometric data protection, other states have enacted laws similar to the BIPA, and still others are considering such legislation. Companies that want to implement technology that uses employee or customer biometric information (for timekeeping, physical security, validating transactions, or other purposes) need to be prepared. For more information on the nursing home case and advice on how to prepare when collecting biometric information, our comprehensive article is available here.


Not to be outdone by the recent attention to biometric information in Illinois, and the Prairie State’s Biometric Information Privacy Act (BIPA), Washington enacted a biometric data protection statute of its own, HB 1493, which became effective July 23, 2017.

What is notable about Washington’s new biometric information law?

  • It prohibits “persons” from “enrolling” “biometric identifiers” in a database for a “commercial purpose” without first providing notice, obtaining consent, or providing a mechanism to prevent the subsequent use of the biometric identifiers for a commercial purpose. Lots of definitions, more on that below.
  • The exact type of notice and consent depends on the context, and notice must be given through a procedure reasonably designed to be readily available to affected individuals. Note that the law does not require notice and consent if the person collects, captures, or enrolls a biometric identifier and stores it in a biometric system, or otherwise, in furtherance of a security purpose.
  • In general, a person that has obtained a biometric identifier from an individual and enrolled that identifier may not sell, lease or otherwise disclose the identifier absent consent. There are, of course, some exceptions, such as the disclosure being necessary to provide a product requested by the individual. In addition, a person generally may not use or disclose a biometric identifier for a purpose that is materially inconsistent with the terms under which the identifier was originally provided.
  • Persons that possess biometric identifiers of individuals that have been enrolled for a commercial purpose must (i) have reasonable safeguards to protect against unauthorized access or acquisition to the identifiers, and (ii) not retain the identifiers for longer than is necessary to carry out certain functions, such as providing the product for which the identifier was acquired.
  • There is no private right of action under the new Washington law. It is to be enforced by the state’s Attorney General. Remember that Illinois’ BIPA does permit persons to sue for violations of that law.

To understand how the law applies, one needs to review the defined terms. For example, the term “biometric identifiers” means:

data generated by automatic measurements of an individual’s biological characteristics, such as a fingerprint, voiceprint, eye retinas, irises, or other unique biological patterns or characteristics that is used to identify a specific individual. “Biometric identifier” does not include a physical or digital photograph, video or audio recording or data generated therefrom, or information collected, used, or stored for health care treatment, payment, or operations under the federal health insurance portability and accountability act of 1996.

The law also defines “commercial purpose” to mean:

a purpose in furtherance of the sale or disclosure to a third party of a biometric identifier for the purpose of marketing of goods or services when such goods or services are unrelated to the initial transaction in which a person first gains possession of an individual’s biometric identifier.

And, the term “enroll” means

to capture a biometric identifier of an individual, convert it into a reference template that cannot be reconstructed into the original output image, and store it in a database that matches the biometric identifier to a specific individual.

The use of biometrics and biometric identifiers in commercial transactions and for other purposes is growing, and so is the number of state laws intending to protect that kind of data. Businesses that use or disclose biometrics in carrying out their business should carefully consider whether this new state law applies and, if so, what they need to do to comply.

Capturing the time employees work can be a difficult business. In addition to the complexity involved with accurately tracking arrival times, lunch breaks, overtime, etc. across a range of federal and state laws (check out our Wage and Hour colleagues who keep up on all of these issues), many employers worry about “buddy punching” or other situations when time entered into their time management system is entered by a person other than the employee to whom the time relates. To address that worry, some companies have implemented biometric tools to validate time entries. A simple scan of an individual’s fingerprint, for example, can validate that the individual is the employee whose time is being entered. But that simple scan can come with some significant compliance obligations, as well as exposure to litigation as discussed in a recent Chicago Tribune article.

The use of biometric data still seems somewhat futuristic and high-tech, but the technology has been around for a while, and there are already a number of state laws addressing the collection, use and safeguarding of biometric information. We’ve discussed some of those here, including the Illinois Biometric Information Privacy Act (BIPA), which is the subject of the litigation referenced above. Notably, the Illinois law permits individuals to sue for violations and, if successful, to recover liquidated damages of $1,000 or actual damages, whichever is greater, along with attorneys’ fees and expert witness fees. The liquidated damages amount increases to $5,000 if the violation is intentional or reckless.

For businesses that want to deploy this technology, whether for time management, physical security, validating transactions or other purposes, there are a number of things to be considered. Here are just a few:

  • Is the company really capturing biometric information as defined under the applicable law? New York Labor Law Section 201-a generally prohibits the fingerprinting of employees by private employers. However, a biometric time management system may not actually be capturing a “fingerprint.” According to an opinion letter issued by the State’s Department of Labor on April 22, 2010, a device that measures the geometry of the hand is permissible as long as it does not scan the surface details of the hand and fingers in a manner similar or comparable to a fingerprint. But, under BIPA, this distinction may not work in some cases. “Biometric information” means any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s biometric identifier used to identify an individual, such as a fingerprint. As a federal district court explained: “The affirmative definition of ‘biometric information’ does important work for [BIPA]; without it, private entities could evade (or at least arguably could evade) [BIPA]’s restrictions by converting a person’s biometric identifier into some other piece of information, like a mathematical representation or, even simpler, a unique number assigned to a person’s biometric identifier. So whatever a private entity does in manipulating a biometric identifier into a piece of information, the resulting information is still covered by [BIPA] if that information can be used to identify the person.”
  • How long should biometric information be retained? A good rule of thumb – avoid keeping personal information for longer than is needed. The Illinois statute referenced above codifies this rule. Under that law, biometric identifiers and biometric information must be permanently destroyed when the initial purpose for collecting or obtaining such identifiers or information has been satisfied or within 3 years of the individual’s last interaction with the entity collecting it, whichever occurs first.
  • How should biometric information be accessed, stored and safeguarded? Before collecting biometric data, companies may need to provide notice and obtain written consent from the individual. This is the case in Illinois. As with other personal data, if it is accessible to or stored by a third party services provider, the company should obtain written assurances from its vendors concerning such things as minimum safeguards, record retention, and breach response.
  • Is the company ready to handle a breach of biometric data? Currently, 48 states have passed laws requiring notification of a breach of “personal information.” Under those laws, the definitions of personal information vary, and the definitions are not limited to Social Security numbers. A number of them include biometric information, such as Connecticut, Illinois, Iowa and Nebraska. Accordingly, companies should include biometric data as part of their written incident response plans.

The use of biometrics is no longer something only seen in science fiction movies or police dramas on television. It is entering mainstream, including the workplace and the marketplace. Businesses need to be prepared.

Fingerprints, voice prints and vein patterns in a person’s palm are three examples of biometrics that may be “moving into the consumer mainstream to unlock laptops and smartphones, or as a supplement to passwords at banks, hospitals and libraries,” reports Anne Eisenberg at the New York Times. Of course, these technologies, aimed at increasing security and, to a lesser degree, convenience, raise data privacy concerns and other risks. However effective, convenient, and efficient these technologies may be, companies need to think through carefully their adoption and implementation, particularly in the workplace.

Below are just a few of the kinds of questions companies should be asking before implementing technologies that involve capturing biometric information.  It is likely that such technologies will go mainstream and, if so, spawn new laws regulating the use of biometric information. Thus, companies using such technologies will need to continue to monitor the legal landscape to manage their risks.

Can we collect this information? In some cases, the answer may be no. For example, in New York, Labor Law Section 201-a prohibits the fingerprinting of employees by private employers, unless required by law. However, according to an opinion letter issued by the State’s Department of Labor on April 22, 2010, a device that measures the geometry of the hand is permissible as long as it does not scan the surface details of the hand and fingers in a manner similar or comparable to a fingerprint. Other states may permit the collection of biometric information provided certain steps are taken. The Illinois Biometric Information Privacy Act, for instance, prohibits private entities from obtaining a person’s or customer’s biometric identifier or biometric information unless the person is informed in writing and consents in writing.

If we can collect it, do we have to safeguard it?  Regardless of whether a statute requires a business to safeguard such information, we believe it is good practice to do so. However, states such as Illinois (see above) already require a reasonable standard of care when storing, transmitting or disclosing biometric information.

Is there a notification obligation if unauthorized persons get access to biometric information? In some states the answer is yes.  The breach notification statutes in states such as Michigan include biometric data in the definition of personal information. MCLS § 445.72

Are there any requirements for disposing of this information? Yes, a number of states (e.g., Colorado and Massachusetts) require that certain entities meet minimum standards for properly disposing records containing biometric information.

Can employees claim this technology amounts to some form of discrimination? In addition to securing devices and accounts, biometric technologies also are being used to track employee time and attendance in order to enhance workforce management. These different applications can form the basis of discrimination claims. For example, earlier in 2013, the U.S. Equal Employment Opportunity Commission (EEOC) claimed an employer’s use of a biometric hand scanner to track employee time and attendance violated federal law by failing to accommodate certain religious beliefs which opposed the use of such devices.

Retinal scan technology is another biometric technology that can be used for identification/security purposes.  However, as explained in a recent Biometric.com article, “examining the eyes using retinal scanning can aid in diagnosing chronic health conditions such as congestive heart failure and atherosclerosis…[as well as] diseases such as AIDS, syphilis, malaria, chicken pox and Lyme disease [and] hereditary diseases, such as leukemia, lymphoma, and sickle cell anemia.” Thus, the data captured by such scans can inform employers about the health conditions of their employees, raising a range of medical privacy, medical inquiry and discrimination issues under federal and state laws, such as the Americans with Disabilities Act. 

As Data Privacy Day 2026 approaches, organizations face an inflection point in privacy, artificial intelligence, and cybersecurity compliance. The pace of technological adoption, in particular AI tools, continues to outstrip legal, governance, and risk frameworks. At the same time, regulators, plaintiffs, and businesses are increasingly focused on how data is collected, used, monitored, and safeguarded.

Below are our Top 10 Privacy, AI, and Cybersecurity Issues for 2026.

1. AI Governance Becomes Operational and Enforceable

AI governance in 2026 will be judged less by aspirational principles and more by documented processes, controls, and accountability. Organizations using AI for recruiting, managing performance, improving efficiency and security, and creating content, among a myriad of other use cases, will be expected to demonstrate how AI systems are developed, deployed, and governed, considering a global patchwork of existing and emerging laws and regulations affecting AI and related technologies.

Action items for 2026:

  • Maintain an enterprise AI inventory, including shadow or embedded AI features.
  • Classify AI systems by risk and use case (HR, monitoring, security, consumer-facing).
  • Establish cross-functional AI governance (legal, privacy/infosec, HR, marketing, finance, operations).
  • Implement documentation and review processes for high-risk AI systems.


2. AI-Driven Workplace Monitoring Under Scrutiny

AI-enabled monitoring tools (dashcams, performance management solutions, wearables, etc.) are increasingly used to track productivity, behavior, communications, and engagement. These tools raise heightened concerns around employee privacy, fairness, transparency, and proportionality, especially when AI generates insights or scores that influence employment decisions.

Regulators and plaintiffs are paying closer attention to whether monitoring involves over-collection by design, and whether AI outputs are explainable and defensible.

Action items for 2026:

  • Audit existing monitoring and productivity tools for AI functionality.
  • Assess whether monitoring practices align with data minimization principles.
  • Update employee notices and policies to clearly explain AI-driven monitoring.
  • Ensure human review and appeal mechanisms for AI-influenced decisions.

3. Biometrics Expand and So Does Legal Exposure

Biometric data collection continues to expand beyond fingerprints and facial recognition to include voiceprints, behavioral identifiers, and AI-derived biometric inferences. Litigation under Illinois’ Biometric Information Privacy Act (BIPA) remains active, but risk is spreading through broader definitions of sensitive data in state privacy laws.

Action items for 2026:

  • Identify all biometric and biometric-adjacent data collected directly or indirectly.
  • Review vendor tools to ensure compliance.
  • Update biometric notices, consent processes, and retention schedules.
  • Align biometric compliance efforts with broader privacy programs.

4. CIPA Litigation and Website Tracking Technologies Continue to Evolve

California Invasion of Privacy Act (CIPA) litigation related to session replay tools, chat features, analytics platforms, and tracking pixels remains a major risk area, even as legal theories evolve. AI-enhanced tracking tools that capture richer interactions only heighten exposure. Organizations often underestimate the privacy implications of seemingly routine website and chatbot technologies.

Action items for 2026:

  • Conduct a comprehensive audit of website and app tracking technologies.
  • Reassess consent banners, disclosures, and opt-out mechanisms.
  • Evaluate AI-enabled chatbots and analytics for interception risks.
  • Monitor litigation trends and adjust risk tolerance accordingly.

5. State Comprehensive Privacy Laws Enter an Implementation and Enforcement Phase

Organizations are no longer preparing for state privacy laws; they are living under them. The California Consumer Privacy Act (CCPA), along with other state laws, imposes increasing operational obligations.

California’s risk assessment requirements, cybersecurity audit mandates, and automated decision-making technology (ADMT) regulations represent a significant shift toward proactive compliance.

Action items for 2026:

  • Comply with annual review and update requirements.
  • Conduct CCPA-mandated risk assessments for high-risk processing.
  • Prepare for cybersecurity audit obligations and documentation expectations.
  • Inventory and assess ADMT used in employment, monitoring, and consumer contexts.

6. Data Minimization Becomes One of the Most Challenging Compliance Obligations

Data minimization has moved from an abstract compliance principle to a central operational challenge. Modern AI systems, monitoring tools, and security platforms are frequently architected to collect and retain expansive datasets by default, even when narrower data sets would suffice. This design approach increasingly conflicts with legal obligations that require organizations to limit data collection to what is necessary, proportionate, and purpose-specific, not only in terms of retention, but at the point of collection itself. As regulatory scrutiny intensifies, organizations must be prepared to explain why specific categories of data were collected, how those decisions align with defined business purposes, and whether less intrusive alternatives were reasonably available.

Action items for 2026:

  • Reassess data collection across AI, HR, and security systems.
  • Implement retention limits and transfer restrictions tied to business necessity and legal risk.
  • Challenge “collect now, justify later” deployments that rely on large-scale or continuous data exports.
  • Integrate data minimization and Bulk Data Transfer rule analysis into AI governance and system design reviews.

7. Importance of the DOJ Bulk Transfer Rule

In 2026, bulk sensitive data transfers are no longer a background compliance issue but a regulated risk category in their own right. Under the Department of Justice’s Bulk Data Transfer Rule, which took effect in 2025, organizations must closely assess whether large-scale transfers or access to U.S. sensitive personal or government-related data involve countries of concern or covered persons. The rule reaches a wide range of transactions, including vendor, employment, and service arrangements, and imposes affirmative obligations around due diligence, access controls, and ongoing monitoring.

Action items for 2026:

  • Update data mapping activities to include sensitive data collection and data storage.
  • Catalog where bulk data transfers occur, including transfers between internal systems, vendors, and cross-border environments.
  • Develop a compliance program that includes due diligence steps, vendor agreement language, and internal access controls.
  • Evaluate the purpose of each bulk transfer.

8. UK and EU Data Protection Law Reforms

Recent and proposed amendments to UK and EU data protection laws are designed to clarify or simplify compliance obligations for organizations, regardless of sector. Changes will impact both commercial and workplace data handling practices.   

UK: Data Use and Access Act (DUAA)

The UK has enacted the Data Use and Access Act, which amends key provisions of the UK General Data Protection Regulation (UK GDPR) and the Privacy and Electronic Communications Regulations (PECR). These reforms relate to subject access requests and complaints, automated processing, the lawful basis to process, cookies, direct marketing, and cross-border transfers, among others. Implementation is occurring in stages, with changes relating to subject access requests, complaints, and automated decision-making taking effect over the next few months.

EU: Digital Omnibus Regulation

The European Commission has proposed a Digital Omnibus Regulation, which introduces amendments to the EU General Data Protection Regulation. Proposed changes include redefining “personal data”, simplifying the personal data breach notification process, clarifying the data subject access process, and managing cookies.

Action items for 2026:

  • Review forthcoming guidance from the UK Information Commissioner’s Office.
    • Implement a data subject complaint process.
    • Review existing lawful bases and purposes for processing.
    • Prepare any necessary updates for employee training.
  • Monitor the progress of the proposed Digital Omnibus Regulation.
    • Review data inventories in the event the definition of personal data is revised.
    • Update data subject access response processes.
    • Review the use and nature of any cookies deployed on the organization’s website.

9. Vendor and Third-Party AI Risk Management Intensifies

Most organizations buy rather than build AI technologies, purchasing from vendors such as recruiting platforms, notetaking tools, monitoring applications, cybersecurity providers, and analytics services whose systems depend on large-scale data ingestion. From procurement to MSA negotiation to record retention obligations, novel and challenging issues arise as organizations seek to minimize third-party and fourth-party service provider risk. Importantly, vendor contracts often have not kept pace with the nature of AI models or how to allocate risk.

Action items for 2026:

  • Update vendor diligence to include privacy, security, and AI-specific risk assessments.
  • Revise contracts to address AI training data, secondary use, audit rights, and allocation of liability.
  • Monitor downstream data sharing, model updates, and cross-border or large-scale data movements.

10. Privacy, AI, and Cybersecurity Fully Converge

In 2026, the lines between privacy, cybersecurity, and AI will continue to blur, leaving organizations that silo these disciplines to face increasing regulatory, litigation, and operational risk.

Action items for 2026:

  • Integrate privacy, AI governance, and cybersecurity leadership.
  • Harmonize risk assessments and reporting structures.
  • Align training and compliance messaging across functions.
  • Treat privacy and AI governance as enterprise risk issues.

As Data Privacy Day 2026 highlights, the challenge is no longer identifying emerging risks but managing them at scale, across systems, and in real time. AI, biometrics, monitoring technologies, and expanding privacy laws demand a more mature, integrated approach to compliance and governance.

A blend of evolving judicial interpretation, aggressive plaintiffs’ counsel, and decades-old statutory language has brought new life to the Florida Security of Communications Act (FSCA) as a vehicle for challenging commonplace website technologies.

At its core, the FSCA was enacted to protect privacy by prohibiting the unauthorized interception of wire, oral, or electronic communications, with far stricter requirements than federal law. Unlike the federal Wiretap Act (which allows one-party consent), Florida typically requires all-party consent before recording or intercepting electronic communications. The FSCA also generally prohibits the interception of any wire, oral, or electronic communication, as well as the use and disclosure of unlawfully intercepted communications “knowing or having reason to know that the information was obtained through the interception of a wire, oral, or electronic communication.”

The New Wave of FSCA Claims

For plaintiffs, an attractive provision of the FSCA is that actual damages need not be established to recover for violations. Under the FSCA, a plaintiff can recover liquidated damages of at least $1,000 for violations without a showing of actual harm, as well as punitive damages and attorneys’ fees. One need only examine the explosion of litigation under other laws with similar damages provisions (e.g., the California Invasion of Privacy Act (CIPA), Telephone Consumer Protection Act (TCPA), Illinois Biometric Information Privacy Act (BIPA), the Illinois Genetic Information Privacy Act (GIPA)) to see this model in action.

For years, courts were reluctant to apply the FSCA to digital technologies like website trackers or analytics tools. Courts routinely dismissed early FSCA lawsuits targeting session-replay software and cookies—finding that these tools didn’t intercept the “contents” of communications in a manner the statute was meant to reach. See Jacome v. Spirit Airlines, Inc., No. 2021-000947-CA-01 (Fla. 11th Cir. Ct. June 17, 2021). This view may be shifting.

Recent cases suggest courts may be more open to digital wiretapping-type claims brought in Florida than previously indicated.

  • A nationwide class action pending in the Southern District of Florida, Cobbs v. PetMed Express, Inc., alleges that PetMed Express, an online veterinary pharmacy, used embedded tracking technologies that enabled third-party companies to capture information about consumers’ prescription-related browsing and purchase activity on its website. The tracking tools allegedly intercepted URLs, search queries, and personally identifiable information such as email addresses and phone numbers. This case highlights the growing litigation risks associated with embedded website tracking technologies – particularly when sensitive data such as prescription or health-related information is involved.
  • In Magenheim v. Nike, Inc., filed in December 2025 in the Southern District of Florida, the plaintiffs allege that Nike triggered undisclosed tracking technologies on visitors’ web browsers immediately upon visiting the website – before users could review privacy disclosures or provide consent – and even when users enabled Global Privacy Control (GPC) signals or selected “do not share my data” on the site. The lawsuit seeks class certification covering all Florida visitors to Nike’s website over the past two years, and underscores the increasing litigation risk surrounding online privacy expectations and the handling of browser-based tracking data.
  • In a lawsuit filed against a large health system in Florida and pending before the U.S. District Court for the Middle District of Florida, the plaintiff, a patient of that health system, alleges that the hospital system embedded tracking technologies within its website and patient portal. As pleaded in the putative class action, the tracking tools allegedly intercepted patients’ online queries regarding symptoms, treatments, and other health-related content. The FSCA and federal Wiretap Act claims survived a motion to dismiss, in line with the growing trend of courts scrutinizing the use of tracking technologies – particularly in the health care context.

What Courts Are Grappling With

At the heart of these disputes are questions that courts nationwide are wrestling with:

  • What constitutes an “interception” under an analog-era statute when applied to digital data?
  • Do URLs, clicks, form inputs, and other web interactions qualify as the “contents” of communications protected by wiretapping laws?
  • When (and whether) is consent provided via privacy notices or cookie banners sufficient to defeat a statutory wiretapping claim?

Courts have reached different answers, leaving Florida businesses in limbo, with the uncertainty driving increasing claims from plaintiffs.

What This Means for Your Business

Whether you operate a website, mobile app, or digital marketing campaign, the Florida FSCA litigation trend shows no signs of slowing. To mitigate risks and avoid becoming a target of wiretapping claims, consider the following practical steps:

1. Audit All Tracking Technologies

Inventory all third-party pixels, session-replay tools, analytics scripts, and email tracking. Understand what data they capture, when it is transmitted, and which third parties receive it.
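Part of that inventory can be automated. The sketch below is illustrative only: the helper name, sample HTML, and hostnames are hypothetical, and a regex scan is a rough heuristic, not a substitute for a full audit of dynamically loaded tags.

```typescript
// Sketch: list third-party script hosts found in a page's HTML.
// The function name, sample markup, and hostnames are illustrative assumptions.

function thirdPartyScriptHosts(html: string, firstPartyHost: string): string[] {
  const hosts = new Set<string>();
  // Match src attributes of <script> tags (a rough heuristic, not a full parser,
  // and it will miss scripts injected dynamically at runtime).
  const scriptSrc = /<script[^>]*\bsrc=["']([^"']+)["']/gi;
  let m: RegExpExecArray | null;
  while ((m = scriptSrc.exec(html)) !== null) {
    try {
      // Resolve relative URLs against the first-party origin.
      const host = new URL(m[1], `https://${firstPartyHost}`).hostname;
      if (host !== firstPartyHost) hosts.add(host);
    } catch {
      // Ignore malformed URLs.
    }
  }
  return [...hosts].sort();
}

// Example: one first-party script and one third-party tracker.
const sample = `
  <script src="/js/app.js"></script>
  <script src="https://analytics.tracker-example.net/pixel.js"></script>
`;
console.log(thirdPartyScriptHosts(sample, "example.com"));
// → ["analytics.tracker-example.net"]
```

A real audit would also capture network requests (pixels, XHR beacons) in a browser, but even a static scan like this surfaces embedded scripts worth reviewing with counsel.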

2. Reevaluate Your Consent Mechanisms

Passive privacy disclosures may not be enough. Use clear, affirmative consent mechanisms (e.g., click-to-accept banners) that disclose what is collected and how it is used before any tracking occurs.
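The gating rule can be reduced to a small, testable decision. The sketch below is an assumption-laden illustration, not any specific consent-management API: trackers fire only after an affirmative opt-in, and a Global Privacy Control signal is honored as an opt-out regardless of banner state.

```typescript
// Sketch of consent gating. The ConsentState shape and mayTrack helper are
// illustrative assumptions, not part of any real consent-management platform.

interface ConsentState {
  affirmativeOptIn: boolean; // e.g., the user clicked "Accept" on a banner
  gpcSignal: boolean;        // in supporting browsers: navigator.globalPrivacyControl
}

function mayTrack(state: ConsentState): boolean {
  if (state.gpcSignal) return false; // honor GPC as an opt-out
  return state.affirmativeOptIn;     // silence or a passive notice is not consent
}

console.log(mayTrack({ affirmativeOptIn: true, gpcSignal: false }));  // true
console.log(mayTrack({ affirmativeOptIn: true, gpcSignal: true }));   // false: GPC wins
console.log(mayTrack({ affirmativeOptIn: false, gpcSignal: false })); // false: no opt-in
```

The key design point is ordering: a site would check `mayTrack` before injecting any tracking script, rather than loading trackers by default and suppressing them afterward.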

3. Limit Data to What’s Necessary – Minimization

Where possible, restrict the capture of high-risk data (e.g., URLs revealing sensitive information or form content) and weigh whether aggressive tracking is essential for business purposes.
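One concrete minimization technique is redacting high-risk query parameters before a URL is handed to an analytics tool. The sketch below uses an illustrative parameter list; which parameters count as sensitive depends on the site and should be decided with counsel.

```typescript
// Sketch: strip high-risk query parameters from a URL before it reaches an
// analytics endpoint. SENSITIVE_PARAMS is an illustrative assumption.

const SENSITIVE_PARAMS = ["q", "query", "email", "phone", "ssn", "symptom"];

function redactUrl(raw: string): string {
  const url = new URL(raw);
  for (const p of SENSITIVE_PARAMS) {
    // Replace the value but keep the parameter, so analytics still sees
    // page structure without the sensitive content.
    if (url.searchParams.has(p)) url.searchParams.set(p, "[redacted]");
  }
  return url.toString();
}

console.log(redactUrl("https://example.com/search?q=chest+pain&page=2"));
// → "https://example.com/search?q=%5Bredacted%5D&page=2"
```

A search query that might reveal a medical condition is scrubbed, while non-sensitive parameters such as pagination survive for legitimate analytics purposes.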

4. Update Privacy Policies and Terms

Make your data collection and sharing practices transparent and easily accessible. Regularly update legal disclosures to mirror how tools actually function.

5. Tighten Vendor Contracts

Ensure contracts with analytics, marketing, and tracking vendors allocate compliance responsibility and include indemnification clauses where appropriate.

6. Monitor Legal Developments

Florida’s legal landscape is shifting rapidly. Maintain awareness of new decisions and legislative changes that may clarify or expand FSCA applicability.

Conclusion

The surge of digital wiretapping claims under the Florida Security of Communications Act illustrates how old statutes can take on new life in an era of ubiquitous data collection. What once was a niche privacy theory now exposes businesses, large and small, to class action risk and costly litigation.

By understanding the evolving legal landscape and implementing proactive compliance strategies, companies can better safeguard their digital practices and reduce the risk of costly FSCA claims.