A putative class action filed in December 2025 in the U.S. District Court for the Central District of Illinois is a timely reminder that AI meeting assistant and transcription tools can create significant legal exposure when organizations deploy them without appropriate governance guardrails. It also underscores the importance of applying strong governance principles when evaluating and deploying these and similar technologies.
What the Fireflies.AI Complaint Alleges
The plaintiff in Cruz v. Fireflies.AI Corp., No. 3:25-cv-03399 (C.D. Ill.), alleges that she participated in a virtual meeting hosted by an Illinois nonprofit organization that had enabled Fireflies.ai, a popular AI meeting assistant that automatically joins Zoom, Microsoft Teams, and Google Meet sessions to record, transcribe, and analyze conversations. She alleges she never created a Fireflies account, never agreed to its Terms of Service, and never provided written consent authorizing the collection of her biometric data.
Several states, most notably Illinois under its Biometric Information Privacy Act (“BIPA”), regulate the collection and processing of biometric identifiers and biometric information, creating significant compliance and litigation risk. Check out our summary of that regulation.
The crux of the BIPA claims is straightforward: Fireflies’ “Speaker Recognition” feature, marketed as able to identify different speakers in meetings and audio files, necessarily generates voiceprints — biometric identifiers expressly covered by BIPA. The complaint alleges Fireflies violated BIPA in three distinct respects:
- failing to maintain and publicly publish a retention schedule and destruction policy for biometric data;
- failing to inform participants in writing that voiceprints were being collected, or of the purpose and duration of that collection; and
- collecting voiceprints without obtaining a written release from participants — including non-account holders who were simply present in recorded meetings.
The plaintiff seeks statutory damages of $1,000 per negligent violation and $5,000 per reckless or intentional violation, plus attorneys’ fees and injunctive relief.
Why This Matters, For and Well Beyond the Fireflies Litigation
In the case of AI meeting and transcription tools, consider the following use cases along with the potential legal and other risks if, in fact, the tools are capturing biometric information:
- Trainings. When multiple employees use the same workstation or conference room to join trainings using an AI transcription tool, voiceprints of each participant may be captured. Unless each individual has provided written consent, exposure compounds with each meeting and each attendee.
- Witness and investigation interviews. HR professionals and corporate investigators increasingly use AI transcription tools to document and summarize interviews. If those tools generate voiceprints, each interviewee’s biometric data may be collected without the required notice or written release.
- Applicant interviews. Talent acquisition teams using AI notetakers during candidate interviews may be capturing the voiceprints of applicants who are unlikely to have been informed their biometric data is being processed.
- Patient and client encounters. Healthcare providers and other licensed professionals using AI transcription in clinical or counseling settings face layered risk: HIPAA, state privacy laws, and, where applicable, biometric information protections.
Even beyond meeting assistant and transcription tools, as similar technologies are embedded into a myriad of devices and applications, questions about the collection of biometric information arise. Examples include performance management platforms and AI glasses, both of which can capture and record audio and video.
The allegations in Cruz highlight a risk that extends far beyond any one technology, use case, or state’s law. AI meeting and transcription tools, like many emerging technologies, can deliver substantial productivity and other benefits to an organization, as their rapid adoption across a wide range of use cases demonstrates. Whether Fireflies.ai in fact collects biometric identifiers or biometric information remains to be seen. What we do know is that rolling out technology without appropriate due diligence can expose an organization to significant compliance and litigation risk.
Governance Takeaways
- Adopt a team approach to due diligence. The data privacy and security challenges presented by complex and easily adaptable technologies cannot be solved by the IT department alone. Technology safeguards are critical, but they do not replace strong administrative, physical, and organizational controls, nor do they address the nuances particular applications introduce. Executives, legal counsel, HR professionals, risk managers, and other key stakeholders should be at the table to ensure the right questions are being asked.
- Know your legal and contractual limitations, and the various ways they could apply. An organization’s compliance team need not, and should not, consist solely of lawyers. But it should maintain a keen awareness of the legal and contractual limitations on the use of certain technologies, and of their potential use cases.
- Change management. It is increasingly common to vet and deploy a technology for one application, only to discover that it can easily address other, unrelated problems. Or the vendor may significantly expand the technology’s functionality in ways that could benefit the organization beyond its current use. Will those pursuing the additional use cases or functionality reopen the due diligence analysis? They should.
- Write it down. Even if the organization is checking off the three items above, building structure around the process helps ensure it remains ongoing and consistent.