In 2024, Colorado passed the first comprehensive state-level law in the U.S. regulating the use of artificial intelligence, the Colorado Artificial Intelligence Act (the Act). The Act imposes strict requirements on developers and deployers of “high-risk” AI systems, particularly in sectors like employment, housing, finance, and healthcare. It drew criticism for its complexity, breadth, and potential to stifle innovation.

In early 2025, lawmakers introduced Senate Bill (SB) 25-318 as a response to growing concerns from the tech industry, employers, and even Governor Jared Polis, who reluctantly signed the Act into law last year.

SB 25-318 aimed to soften and clarify some of the more burdensome aspects of the original legislation before its compliance deadline of February 1, 2026.

Amendments proposed under SB 25-318 included:

  • An exception to the definition of “developer” if the person offers an AI system with open model weights and meets specified conditions.
  • Exemptions for specified technologies.
  • Elimination of the duty of a developer or deployer to use reasonable care to protect consumers from known or reasonably foreseeable risks of algorithmic discrimination and the requirement to notify the state attorney general of such risk.
  • An exemption from specified disclosure requirements for developers if they meet certain financial and operational criteria.

Despite its intention to strike a balance between innovation and regulation, SB 25-318 was voted down 5-2 by the Senate Business, Labor, and Technology Committee on May 5, 2025.

With SB 25-318 dead, the original Act remains intact, and the next step is for the Colorado Attorney General to issue rules and/or guidance. As it now stands, businesses and developers operating in Colorado must prepare for full compliance by early 2026 unless this date is otherwise extended.

If you have questions about compliance with Colorado’s Artificial Intelligence Act or related issues, contact a Jackson Lewis attorney to discuss.

Online retailer Harriet Carter Gifts recently obtained summary judgment from the district court in a class action under Pennsylvania wiretap law. At the heart of this case is the interpretation and application of the Pennsylvania Wiretapping and Electronic Surveillance Control Act of 1978 (WESCA), a statute designed to regulate the interception of electronic communications. The court’s primary task was to determine whether the actions of Harriet Carter Gifts and its third-party marketing vendor, NaviStone, constituted an unlawful interception under this law.

In 2021, the district court sided with the defendants, granting summary judgment because NaviStone was a direct party to the communications, and thus, no interception occurred under WESCA. However, the Third Circuit Court of Appeals overturned this decision. The appellate court clarified that there is no broad direct-party exception to civil liability under WESCA. Consequently, the case was remanded to determine “whether there is a genuine issue of material fact about where the interception occurred.”

On remand, the district court examined whether the plaintiff, Popa, could be deemed to have consented to the interception of her data by NaviStone through the privacy policy posted on Harriet Carter’s website. The court focused on whether the privacy policy was sufficiently conspicuous to provide constructive notice to Popa.

The enforceability of browsewrap agreements, which are terms and conditions posted on a website without requiring explicit user consent, was another critical aspect of the case. The court found that Harriet Carter’s privacy policy was reasonably conspicuous and aligned with industry standards. The court noted that the privacy policy was linked in the footer of every page on the Harriet Carter website, labeled “Privacy Statement,” and was in white font against a blue background. This placement was consistent with common industry practices in 2018 when the violation was alleged, which typically involved placing privacy policies in the footer of websites.

This led the court to conclude that Popa had constructive notice of the terms, reinforcing the notion of implicit consent. Notably, the court found implicit consent without any evidence that Popa had actual knowledge of the terms of the privacy statement.  Rather, the court found a reasonably prudent person would be on notice of the privacy statement’s terms. 

Based on these findings, the court granted summary judgment in favor of the defendants. The court determined that Popa’s WESCA claim failed because she had implicitly consented to the interception by NaviStone, as outlined in Harriet Carter’s privacy statement. 

The case of Popa v. Harriet Carter Gifts, Inc. and NaviStone, Inc. emphasizes the necessity for clear and accessible privacy policies in the digital era. It also brings attention to the complex legal issues related to user consent and the interception of electronic communications. For inquiries regarding compliance with consumer consent requirements for websites, please contact a Jackson Lewis attorney to discuss further.

In late March 2025, the Florida Bar Board of Governors unanimously endorsed the recommendation of its Special Committee on Cybersecurity and Privacy Law that law firms should adopt written incident response plans (IRPs) to better prepare for and respond to data security incidents. The recommendation reflects a growing recognition across professional service industries—particularly law firms—of the serious risks posed by cyber threats and the need for structured, proactive responses.

The message is simple: law firms must be prepared.

As most practitioners will observe, it is not a matter of if an organization will experience a data breach, but when. Developing and implementing an IRP can be difficult because the nature of legal practice poses unique challenges, even for smaller firms. At the same time, law firms are stewards of vast amounts of highly sensitive client and employee data, often spanning multiple industries, jurisdictions, and confidentiality regimes. Those data sets make law firms attractive targets for threat actors, especially those seeking access to intellectual property, litigation strategies, and regulatory or financial information, not to mention sensitive personal information.

What Makes Law Firms Different?

Unlike organizations in several other industries, law firms often lack centralized compliance infrastructures or in-house technical expertise. Client confidentiality obligations and the attorney-client privilege can complicate both the detection and disclosure of incidents. In some cases, firms may confuse confidentiality with security, when both are needed.

In addition, unlike most other professional service providers, law firms grapple with a comprehensive set of rules of professional responsibility that increasingly delve into data privacy and cybersecurity issues. Of course, those rules sit on top of generally applicable business regulation that law firms also face. See, for example, our recent discussion about the Florida Information Protection Act (FIPA), which mandates that certain entities, including law firms, implement reasonable measures to protect electronic data containing personal information.

When engaging a new client, a simple engagement letter may no longer be sufficient, especially for law firms representing certain businesses, particularly those that are heavily regulated. Consider law firms that defend medical malpractice claims. Their clients are most likely healthcare providers covered by the privacy and security regulations under the Health Insurance Portability and Accountability Act (HIPAA). That makes these firms “business associates” to the extent those services involve access to “protected health information.” Just like their healthcare provider clients, business associate law firms are required to maintain an incident response plan. 45 CFR 164.308(a)(6). So, even before Recommendation 25-1, many law firms may have already been obligated to maintain an IRP, at least with respect to certain information collected from or on behalf of certain clients.

Given these realities for law firms, the Florida Bar’s recommendation is both timely and necessary, even if not unprecedented. Notably, in 2018, the ABA issued Formal Opinion 483, which made a similar recommendation. Law firms considering an IRP should consult Formal Opinion 483.

What Should a Law Firm’s Incident Response Plan Include?

A comprehensive and tailored IRP should be risk-based and scalable to firm size, practice areas, and existing infrastructure. Here are some components all firms should consider including in their IRP:

  1. Governance and Roles
    Define the internal response team and assign roles, including legal, IT, communications, HR, and leadership. Identify outside partners, such as breach counsel, forensics, and public relations firms.
  2. Data Mapping and Risk Assessment
    Map the data your firm collects, stores, and shares. Understand where sensitive client and employee information resides and how it is secured. A risk assessment will help prioritize which systems and data are most critical.
  3. Incident Detection and Reporting
    Establish processes for identifying, reporting, and escalating suspected incidents. Time is critical when responding to ransomware, business email compromise, or other attacks. Remember to have a plan for communicating outside of the firm’s existing systems, which may not be operable during an incident.
  4. Investigation and Containment
    Outline steps to contain and investigate an incident, including coordination with law enforcement, insurance carriers, forensic investigators, and legal advisors.
  5. Notification and Legal Obligations
    Address client communications, breach notification laws, ethical duties, and contractual terms that may require specific responses.
  6. Post-Incident Review and Testing
    After resolving an incident, assess what went well and what needs improvement. Regular tabletop exercises and plan updates are essential.

Additional Tools and Resources

With cyber threats evolving and legal obligations expanding, law firms must treat incident response planning as an ethical, professional, and business imperative. The Florida Bar’s recommendation should serve as a wake-up call. By building a strong IRP, law firms can better protect client confidences, meet regulatory requirements, and preserve their professional reputation.

On March 24, 2025, Virginia’s Governor vetoed House Bill (HB) 2094, known as the High-Risk Artificial Intelligence Developer and Deployer Act. This bill aimed to establish a regulatory framework for businesses developing or using “high-risk” AI systems.

The Governor’s veto message emphasized concerns that HB 2094’s stringent requirements would stifle innovation and economic growth, particularly for startups and small businesses. The bill would have imposed nearly $30 million in compliance costs on AI developers, a burden that could deter new businesses from investing in Virginia. The Governor argued that the bill’s rigid framework failed to account for the rapidly evolving nature of the AI industry and placed an onerous burden on smaller firms lacking large legal compliance departments.

The veto of HB 2094 in Virginia reflects a broader debate in AI legislation across the United States. As AI technology continues to advance, both federal and state governments are grappling with how to regulate its use effectively.

At the federal level, AI legislation has been marked by contrasting approaches between administrations. Former President Biden’s Executive Orders focused on ethical AI use and risk management, but many of these efforts were revoked by President Trump this year. Trump’s new Executive Order, titled “Removing Barriers to American Leadership in Artificial Intelligence,” aims to foster AI innovation by reducing regulatory constraints.

State governments are increasingly taking the lead in AI regulation. States like Colorado, Illinois, and California have introduced comprehensive AI governance laws. The Colorado AI Act of 2024, for example, uses a risk-based approach to regulate high-risk AI systems, emphasizing transparency and risk mitigation. While changes to the Colorado law are expected before its 2026 effective date, it may emerge as a prototype for other states to follow.

Takeaways for Business Owners

  1. Stay Informed: Keep abreast of both federal and state-level AI legislation. Understanding the regulatory landscape will help businesses anticipate and adapt to new requirements.
  2. Proactive Compliance: Develop robust AI governance frameworks to ensure compliance with existing and future regulations. This includes conducting risk assessments, implementing transparency measures, and maintaining proper documentation.
  3. Innovate Responsibly: While fostering innovation is crucial, businesses must also prioritize ethical AI practices. This includes preventing algorithmic discrimination and ensuring the responsible use of AI in decision-making processes.

If you have questions about compliance with AI regulation, contact a member of our Artificial Intelligence Group or the Jackson Lewis attorney with whom you regularly work.

On Friday, the U.S. Department of Health and Human Services (HHS), Office for Civil Rights (OCR) announced the fifth enforcement action under its Risk Analysis Initiative. In this case, OCR reached a settlement with Health Fitness Corporation (Health Fitness), a wellness vendor providing services to employer-sponsored group health plans.

This announcement is noteworthy for several reasons: it furthers the OCR’s Risk Analysis Initiative, it serves as a reminder to business associates about HIPAA compliance, and it points to a significant development under ERISA for plan fiduciaries and the service providers to their plans.

The OCR Risk Analysis Initiative

Anyone who reviews prior OCR enforcement actions will notice several trends. One of those trends relates to enforcement actions following a data breach. In those cases, the OCR frequently alleges that the target of the action failed to satisfy the risk analysis standard under the Security Rule. This standard is fundamental: it involves assessing the threats and vulnerabilities to electronic protected health information (ePHI), a process that helps to shape the covered entity’s or business associate’s approach to the other standards and goes beyond a simple gap analysis.

“Conducting an accurate and thorough risk analysis is not only required but is also the first step to prevent or mitigate breaches of electronic protected health information,” said OCR Acting Director Anthony Archeval.  “Effective cybersecurity includes knowing who has access to electronic health information and ensuring that it is secure.”

For those wondering how committed the OCR is to its enforcement initiatives, you need not look further than its Right to Access Initiative. On March 6, 2025, the agency announced its 53rd enforcement action. According to that announcement, it involved a $200,000 civil monetary penalty imposed against a public academic health center and research university for violating an individual’s right to timely access her medical records through a personal representative.

The DOL Cybersecurity Rule

Businesses that sponsor a group health plan or other ERISA employee benefit plans might want to review the OCR’s announcement and resolution agreement concerning Health Fitness a little more carefully. In 2024, the DOL’s Employee Benefits Security Administration (EBSA) issued Compliance Assistance Release No. 2024-01. That release makes clear that the fiduciary obligation to assess the cybersecurity of plan service providers applies to all ERISA-covered employee benefit plans, including wellness programs for group health plans.

OCR commenced its investigation of Health Fitness after receiving four reports from Health Fitness, over a three-month period (October 15, 2018, to January 25, 2019), of breaches of PHI. According to the OCR, “Health Fitness reported that beginning approximately in August 2015, ePHI became discoverable on the internet and was exposed to automated search devices (web crawlers) resulting from a software misconfiguration on the server housing the ePHI.” Despite these breaches, according to the OCR, Health Fitness did not conduct an accurate and thorough risk analysis until January 19, 2024.

For Health Fitness, the settlement means implementing a corrective action plan that OCR will monitor for two years and paying $227,816 to OCR. For ERISA plan fiduciaries, an important question is what they need to do to assess the cybersecurity of plan service providers like Health Fitness during the procurement process and beyond.

We provide some thoughts in our earlier article and want to emphasize that plan fiduciaries need to be involved in the process. Cybersecurity is often a risk left to the IT department. However, that approach may leave even the most ardent IT professional ill-equipped or insufficiently informed about the threats and vulnerabilities of a particular service provider. When it comes to ERISA plans, this means properly assessing the threats and vulnerabilities as they relate to the aspects of plan administration being handled by the service provider.

Third-party plan service providers and plan fiduciaries should begin taking reasonable and prudent steps to implement safeguards that will adequately protect plan data. EBSA’s guidance should help the responsible parties get there, along with the plan fiduciaries and plan sponsors’ trusted counsel and other advisors.

In February, a coalition of healthcare organizations sent a letter to President Donald J. Trump and the U.S. Department of Health and Human Services (HHS) (the Letter), urging the immediate rescission of a proposed update to the Security Rule under HIPAA. The update is aimed at strengthening safeguards for securing electronic protected health information.

According to The HIPAA Journal, the data breach trend in the healthcare industry over the past 14 years is up, not down. This is the case despite the HIPAA Security Rule having been in effect since 2005.

The HIPAA Journal goes on to provide some sobering statistics:

Between October 21, 2009, when OCR first started publishing summaries of data breach reports on its “Wall of Shame”, and December 31, 2023, 5,887 large healthcare data breaches have been reported. On January 22, 2023, the breach portal listed 857 data breaches as still under investigation. This time last year there were 882 breaches listed as under investigation, which shows OCR has made little progress in clearing its backlog of investigations – something that is unlikely to change given the chronic lack of funding for the department.

There have been notable changes over the years in the main causes of breaches. The loss/theft of healthcare records and electronic protected health information dominated the breach reports between 2009 and 2015. The move to digital record keeping, more accurate tracking of electronic devices, and more widespread adoption of data encryption have been key in reducing these data breaches. There has also been a downward trend in improper disposal incidents and unauthorized access/disclosure incidents, but data breaches continue to increase due to a massive increase in hacking incidents and ransomware attacks. In 2023, OCR reported a 239% increase in hacking-related data breaches between January 1, 2018, and September 30, 2023, and a 278% increase in ransomware attacks over the same period. In 2019, hacking accounted for 49% of all reported breaches. In 2023, 79.7% of data breaches were due to hacking incidents.

The Letter, signed by numerous healthcare organizations, outlines several key concerns regarding the proposed HIPAA Security Rule update, including:

  1. Financial and Operational Burdens: The Letter argues that the proposed regulation would impose significant financial and operational burdens on healthcare providers, particularly those in rural areas. The unfunded mandates associated with the new requirements could strain the resources of hospitals and healthcare systems, leading to higher healthcare costs for patients and reduced investment in other critical areas.
  2. Conflict with Existing Law: The Letter points to an amendment to the Health Information Technology for Economic and Clinical Health (HITECH) Act, arguing the proposed enhancements to the Security Rule conflict with the HITECH Act amendment. However, the HITECH Act amendment sought to incentivize covered entities to adopt “recognized security practices” that might minimize (not necessarily eliminate) remedies for HIPAA Security Rule violations and the length and extent of audits and investigations.
  3. Timeline and Feasibility: The Letter highlights concerns about the timeline for implementing the proposed requirements. The depth and breadth of the new mandates, combined with an unreasonable timeline, present significant challenges for healthcare providers.

No doubt, the Trump Administration is intent on reducing regulation on business. However, it will be interesting to see whether it softens or even eliminates the proposed rule in response to the Letter, despite the clear trend of more numerous and damaging data breaches in the healthcare sector, and an increasing threat landscape facing all U.S. businesses.

According to one survey, Florida is fourth on the list of states with the most reported data breaches. No doubt, data breaches continue to be a significant risk for all businesses, large and small, across the U.S., including the Sunshine State. Perhaps more troubling is that class action litigation is increasingly likely to follow a data breach. A common claim in those cases is that the business did not do enough to safeguard personal information from the attack. So, Florida businesses need to know about the Florida Information Protection Act (FIPA), which mandates that certain entities implement reasonable measures to protect electronic data containing personal information.

According to a Law.com article:

The monthly average of 2023 data breach class actions was 44.5 through the end of August, up from 20.6 in 2022.

While a business may not be able to completely prevent a data breach, adopting reasonable safeguards can minimize the risk of one occurring, as well as the severity of an attack. Additionally, maintaining reasonable safeguards to protect personal information strengthens the business’s defensible position should it face a government agency investigation or lawsuit after an attack.

Entities Subject to FIPA

FIPA applies to a broad range of organizations, including:

   •    Covered Entities: This encompasses any sole proprietorship, partnership, corporation, or other legal entity that acquires, maintains, stores, or uses personal information…so, just about any business in the state. There are no exceptions for small businesses.

   •    Governmental Entities: Any state department, division, bureau, commission, regional planning agency, board, district, authority, agency, or other instrumentality that handles personal information.

   •    Third-Party Agents: Entities contracted to maintain, store, or process personal information on behalf of a covered entity or governmental entity. This means that just about any vendor or third-party service provider that maintains, stores, or processes personal information for a covered entity is also covered by FIPA.

Defining “Reasonable Measures” in Florida

FIPA requires:

Each covered entity, governmental entity, or third-party agent shall take reasonable measures to protect and secure data in electronic form containing personal information.

While FIPA mandates the implementation of “reasonable measures” to protect personal information, it does not provide a specific definition, leaving room for interpretation. However, guidance can be drawn from various sources:

  •    Regulatory Guidance: Businesses that are more heavily regulated, such as healthcare entities, can look to the federal and state frameworks that apply to them, such as the Health Insurance Portability and Accountability Act (HIPAA). Entities in the financial sector may be subject to both federal regulations, like the Gramm-Leach-Bliley Act, and state-imposed data protection requirements. The Florida Attorney General’s office may offer insights or recommendations on what constitutes reasonable measures. Here is one example, albeit not comprehensive.
  •   Standards in Other States: Several other states have outlined more specific requirements for protecting personal information. Examples include New York and Massachusetts.

Best Practices for Implementing Reasonable Safeguards

Data security frameworks often contain many overlapping provisions. With that in mind, covered businesses might consider the following non-exhaustive list of best practices toward FIPA compliance. Many of the items on this list will seem obvious, even basic. But in many cases, these measures either have not been implemented or are not covered in written policies and procedures.

  • Conduct Regular Risk Assessments: Identify and evaluate potential vulnerabilities within your information systems to address emerging threats proactively.
  • Implement Access Controls: Restrict access to personal information to authorized personnel only, ensuring that employees have access solely to the data necessary for their roles.
  • Encrypt Sensitive Data: Utilize strong encryption methods for personal information both at rest and during transmission to prevent unauthorized access (an illustrative sketch follows this list).
  • Develop and Enforce Written Data Security Policies, and Create Awareness: Establish comprehensive data protection policies and maintain them in writing. Once completed, information about relevant policies and procedures needs to be shared with employees, along with creating awareness about the changing risk landscape.
  • Maintain and Practice Incident Response Plans: Prepare and regularly update a response plan to address potential data breaches promptly and effectively, minimizing potential damages. Letting this plan sit on the shelf will have minimal impact on preparedness when facing a real data breach. It is critical to conduct tabletop and similar exercises with key members of leadership.
  • Regularly Update and Patch Systems: Keep all software and systems current with the latest security patches to protect against known vulnerabilities.
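
As a purely illustrative companion to the “Encrypt Sensitive Data” item above, the following TypeScript (Node.js) sketch shows one common way to encrypt a single field at rest with AES-256-GCM using Node’s built-in crypto module. It is a minimal sketch under simplifying assumptions: in practice the key would come from a managed key store rather than being generated in code, and key rotation, access controls, and TLS for data in transit are separate (and often harder) pieces of the compliance picture.

```typescript
import { randomBytes, createCipheriv, createDecipheriv } from "crypto";

// Illustrative only: AES-256-GCM encryption of a single field at rest.
// Assumption: in production the key is retrieved from a key management
// service, not generated or hard-coded in application code.
const key = randomBytes(32); // 256-bit key

function encryptField(plaintext: string): { iv: string; tag: string; data: string } {
  const iv = randomBytes(12); // 96-bit nonce, as recommended for GCM
  const cipher = createCipheriv("aes-256-gcm", key, iv);
  const encrypted = Buffer.concat([cipher.update(plaintext, "utf8"), cipher.final()]);
  return {
    iv: iv.toString("base64"),
    tag: cipher.getAuthTag().toString("base64"),
    data: encrypted.toString("base64"),
  };
}

function decryptField(record: { iv: string; tag: string; data: string }): string {
  const decipher = createDecipheriv("aes-256-gcm", key, Buffer.from(record.iv, "base64"));
  decipher.setAuthTag(Buffer.from(record.tag, "base64"));
  return Buffer.concat([
    decipher.update(Buffer.from(record.data, "base64")),
    decipher.final(),
  ]).toString("utf8");
}

// Example: protect a Social Security number before writing it to storage.
const stored = encryptField("123-45-6789");
console.log(decryptField(stored)); // "123-45-6789"
```

Encryption in transit, by contrast, is usually addressed by enforcing TLS on every connection rather than in application code like this.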

By diligently implementing these practices, entities can better protect personal information, comply with Florida’s legal requirements, and minimize risk.

Businesses that track the geolocation of individuals—whether for fleet management, sales and promotion, logistics, risk mitigation, or other reasons—should closely monitor the progress of California Assembly Bill 1355 (AB 1355), also known as the California Location Privacy Act. If passed, this bill would impose significant restrictions on the collection and use of geolocation data, requiring many businesses to overhaul their location tracking policies and procedures.

California has long been at the forefront of data privacy regulation, particularly in the area of location tracking. Section 637.7 of the California Penal Code, for example, provides that no person or entity in California may use an electronic tracking device to determine the location or movement of a person. Notably, the law does not apply when the registered owner, lessor, or lessee of a vehicle has consented to the use of such a device with respect to that vehicle.

More recently, the California Consumer Privacy Act of 2018 (CCPA) established a comprehensive privacy and security framework for personal information of California consumers, which includes granting consumers rights over their personal information. Under the CCPA, consumers have the right, subject to some exceptions, to limit the use of their “sensitive personal information,” a defined term which includes geolocation data. The California Privacy Rights Act of 2020 (CPRA) amended the CCPA, further strengthening these protections by enhancing consumer rights and enforcement mechanisms.

Importantly, employees and contractors are considered “consumers” under the CCPA.

Key Provisions of AB 1355

If enacted, AB 1355 would place strict limits on how businesses collect, use, and retain location information. Here are the major takeaways for businesses that track geolocation data.

Who Does the Law Apply To?  The law would apply to any business (referred to as a “covered entity”) that collects or uses location data from individuals in California, although there is an exception for the location information of patients if the information is protected by HIPAA or similar laws. Government agencies are not considered covered entities but are prohibited from monetizing location information.

The bill defines “individual” as a “natural person located within the State of California.” So, it looks like the individual need not be a California resident. In addition, the collection or use of location data must be necessary to provide goods or services requested by that individual. It is unclear how this provision would apply in the employment context.

Express Opt-In Requirement. Individuals would be required to expressly opt in before their location data could be collected; businesses would not be permitted to infer consent or use pre-checked boxes.
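
For illustration only, here is a minimal TypeScript sketch of what an express opt-in flow for browser geolocation might look like, using the standard Geolocation API. The element IDs, the sendToServer helper, and the consent wording are hypothetical placeholders rather than anything drawn from the bill’s text; any actual implementation should be reviewed against the final statutory language.

```typescript
// Hypothetical sketch: collect geolocation only after an express,
// affirmative opt-in (no pre-checked boxes, no inferred consent).
const optInCheckbox = document.getElementById("location-opt-in") as HTMLInputElement;
const startButton = document.getElementById("start-tracking") as HTMLButtonElement;

optInCheckbox.checked = false; // box starts unchecked; the user must affirmatively select it

startButton.addEventListener("click", () => {
  if (!optInCheckbox.checked) {
    console.log("No express opt-in; location data will not be collected.");
    return;
  }

  navigator.geolocation.getCurrentPosition(
    (position) => {
      // Use the coordinates only to provide the requested service and
      // avoid retaining them longer than necessary.
      sendToServer({
        lat: position.coords.latitude,
        lon: position.coords.longitude,
        consentTimestamp: new Date().toISOString(),
      });
    },
    (error) => console.warn("Location request denied or failed:", error.message),
    { enableHighAccuracy: false } // avoid collecting more precise data than necessary
  );
});

// Placeholder for the application's own transport layer.
function sendToServer(payload: { lat: number; lon: number; consentTimestamp: string }): void {
  // e.g., fetch("/api/location", { method: "POST", body: JSON.stringify(payload) });
}
```

The design point is sequencing: the checkbox defaults to unchecked and no location request is made absent an affirmative act by the user, which tracks the bill’s prohibition on pre-checked boxes and inferred consent.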

Prohibited Actions. Businesses would not be permitted to:

  • Collect more precise location data than is necessary.
  • Retain location data longer than necessary.
  • Sell, rent, trade, or lease location data to third parties.
  • Infer additional data from collected location information beyond what is necessary.
  • Disclose location data to government agencies without a valid court order issued by a California court.

Notice and Policy Requirement. Under AB 1355, businesses would be required to provide clear, prominent notice at the point where location data is collected. The notice would need to include the name of the covered entity and service provider collecting the information, and a phone number and an internet website where the individual can obtain more information. Companies also would need to maintain a location privacy policy detailing, among other things:

  • What location data is collected.
  • The retention and deletion policies.
  • Whether the data is used for targeted advertising.
  • The identities of third parties or service providers with access to the data.

Any changes to this policy would require at least 20 days’ notice and renewed consent.

Enforcement and Legal Remedies. If enacted, AB 1355 would permit the California Attorney General, district attorneys, and other public prosecutors to bring lawsuits against non-compliant businesses. Remedies could include all of the following:

  • Actual damages suffered by affected individuals.
  • A civil penalty of $25,000.
  • Court-ordered injunctions and attorney’s fees for prevailing plaintiffs.

Implications for Businesses Engaged in Location Tracking

This bill represents a major shift in how businesses must approach location tracking. If enacted, businesses relying on geolocation data for purposes such as monitoring employees, connecting with customers, improving logistics, or managing risk must:

  • Implement new opt-in procedures before collecting location data.
  • Reevaluate their data retention policies to ensure compliance.
  • Review agreements with third-party vendors that process location data.
  • Update their privacy policies and internal procedures to align with the new legal requirements.

In addition to monitoring the path of this legislation, businesses also should consider revisiting their current electronic monitoring and tracking activities. Data privacy and security laws have expanded in recent years, with geolocation data being one of the more sensitive categories of personal information protected.

Employee security awareness training is a best practice and a “reasonable safeguard” for protecting the privacy and security of an organization’s sensitive data.  The list of data privacy and cybersecurity laws mandating employee data protection training continues to grow and now includes the EU AI Act.  The following list is a high-level sample of employee training obligations. 

EU AI Act. Effective February 2, 2025, Article 4 of the Act requires that all providers and deployers of AI models or systems must ensure their workforce is “AI literate”.  This means training workforce members to achieve a sufficient level of AI literacy considering various factors such as the intended use of the AI system. Training should incorporate privacy and security awareness given the potential risks. Notably, the Act applies broadly and has extraterritorial reach. As a result, this training obligation may apply to organizations including but not limited to:

  • providers placing on the market or putting into service AI systems or placing on the market general-purpose AI models in the Union, irrespective of whether those providers are established or located within the Union or in a third country (e.g., U.S.);
  • deployers of AI systems that have their place of establishment or are located within the Union; and
  • providers and deployers of AI systems that have their place of establishment or are located in a third country (e.g., U.S.), where the output produced by the AI system is used in the Union.

California Consumer Privacy Act, as amended (CCPA). Cal. Code Regs. tit. 11, § 7100 requires that all individuals responsible for the business’s compliance with the CCPA, or involved in handling consumer inquiries about the business’s information practices, must be informed of all of the requirements in the CCPA, including how to direct consumers to exercise their rights under the CCPA. Under the CCPA, “consumer” means a California resident and includes employees, job applicants, and individuals whose personal data is collected in the business-to-business context.

HIPAA. Under HIPAA, a covered entity or business associate must provide HIPAA privacy training as well as security awareness training to all workforce members. Note that this training requirement may apply to employers in their role as a plan sponsor of a self-insured health plan.

Massachusetts WISP law (201 CMR 17.03). Organizations that own or license personal information about a resident of the Commonwealth are subject to a duty to protect that information. This duty includes implementing a written information security program that addresses ongoing employee training.

23 NYCRR 500. The New York Department of Financial Services’ cybersecurity requirement for financial services companies requires that covered entities provide cybersecurity personnel with cybersecurity updates and sufficient training to address relevant cybersecurity risks. 

Gramm-Leach-Bliley Act and the Safeguards Rule. The Safeguards Rule requires covered financial institutions to implement a written information security program to safeguard non-public information. The program must include employee security awareness training. In 2023, the FTC expanded the definition of financial institutions to include additional industries such as automotive dealerships and retailers that process financial transactions. 

EU General Data Protection Regulation (“EU GDPR”). Under Art. 39 of the EU GDPR, the tasks of a Data Protection Officer include training staff involved in the organization’s data processing activities.

In addition to the above, there are express or implied security awareness training obligations in numerous other laws and regulations, including those applicable to certain Department of Homeland Security contractors, licensees under state insurance laws modeled on the NAIC Insurance Data Security Model Law, and organizations that process credit card payments in accordance with PCI DSS.

Whether mandated by law or implemented as a best practice, ongoing employee privacy and security training plays a key role in safeguarding an organization’s sensitive data. Responsibility for protecting data is no longer the sole province of IT professionals. All workforce members with access to the organization’s sensitive data and information systems share that responsibility. And various stakeholders, including HR professionals, play a vital role in supporting that training.  

For more information on developing employee training, check out our prior posts.

A California federal district court recently granted class certification in a lawsuit against a financial services company. The case involves allegations that the company’s website used third-party technology to track users’ activities without their consent, violating the California Invasion of Privacy Act (CIPA). Specifically, the plaintiffs allege that the company, along with its third-party marketing software platform, intercepted and recorded visitors’ interactions with the website, creating “session replays,” which are effectively video recordings of users’ real-time interactions with the website’s forms. The technology at issue in the suit is routinely used by website operators to keep a record of a user’s interactions with a website, in particular web forms and marketing consents.

The plaintiffs sought class certification for individuals who visited the company’s website, provided personal information, and for whom a certificate associated with their website visit was generated within a roughly year-long time frame. The company argued that users’ consent must be determined on an individual, not class-wide, basis. The company asserted that implied consent could have come from multiple sources, including its privacy policies and third-party materials that provided notice of the data interception and thus should be viewed as consent. Some of the sources the company pointed to as notice included third-party articles on the issue.

The district court found those arguments insufficient and held that common questions of law and fact predominated as to all users. Specifically, the court found whether any of the sources provided notice of the challenged conduct in the first place to be a common issue. Further, the court found that it could later refine the class definition to the extent a user might have viewed a particular source that provided sufficient notice. The court also determined plaintiffs would be able to identify class members utilizing the company’s database, including cross-referencing contact and location information provided by users.

While class certification is not a decision on the merits and does not determine whether the company failed to provide notice or otherwise violated CIPA, it is a significant step in the litigation process. If certification is denied, the potential damages and settlement value of a case are significantly lower. However, if plaintiffs make it over the class certification hurdle, the potential damages and settlement value increase substantially.

This case is a reminder to businesses to review their current website practices and implement updates or changes to address issues such as notice (regarding tracking technologies in use) and consent (whether express or implied) before collecting user data. It is also important, when using third-party tracking technologies, to audit whether vendors comply with privacy laws and have data protection measures in place.
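
As a purely illustrative sketch (and not a description of what any particular vendor or the defendant in this case did), one way a website operator might condition a third-party session-replay or analytics script on express consent is to load it dynamically only after the visitor affirmatively accepts, rather than embedding it on every page by default. The script URL, element ID, and storage key below are hypothetical placeholders.

```typescript
// Hypothetical sketch: load a third-party tracking/session-replay script
// only after the visitor has expressly consented.
const TRACKING_SCRIPT_URL = "https://example-vendor.invalid/replay.js"; // placeholder URL

function loadTrackingScript(): void {
  const script = document.createElement("script");
  script.src = TRACKING_SCRIPT_URL;
  script.async = true;
  document.head.appendChild(script);
}

function hasRecordedConsent(): boolean {
  // Assumption: consent is recorded in first-party storage when the user
  // clicks "Accept" in a consent banner.
  return localStorage.getItem("tracking-consent") === "granted";
}

document.getElementById("consent-accept")?.addEventListener("click", () => {
  localStorage.setItem("tracking-consent", "granted");
  loadTrackingScript();
});

// On subsequent page loads, resume tracking only if consent was previously granted.
if (hasRecordedConsent()) {
  loadTrackingScript();
}
```

Again, the point is sequencing: notice and an affirmative act come before any code capable of capturing a user’s interactions is loaded.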

If you have questions about website tracking technology and privacy compliance, contact a Jackson Lewis attorney to discuss.