The Oklahoma State Legislature recently enacted Senate Bill 626, amending its Security Breach Notification Act, effective January 1, 2026, to address gaps in the state’s current cybersecurity framework (the “Amendment”).  The Amendment includes new definitions, mandates reporting to the state Attorney General, clarifies compliance with similar laws, and provides revised penalty provisions, including affirmative defenses.

Definitions

The Amendment provides clearer definitions related to security breaches, specifying what constitutes “personal information” and “reasonable safeguards.”

  • Personal Information:  The existing definition for “Personal Information” was expanded to also include (1) a unique electronic identifier or routing code in combination with any required security code, access code, or password that would permit access to an individual’s financial account and (2) unique biometric data such as a fingerprint, retina or iris image, or other unique physical or digital representation of biometric data to authenticate a specific individual.
  • Reasonable Safeguards:  The Amendment provides an affirmative defense in a civil action under the law for individuals or entities that have “Reasonable safeguards” in place, which are defined as “policies and practices that ensure personal information is secure, taking into consideration an entity’s size and the type and amount of personal information. The term includes, but is not limited to, conducting risk assessments, implementing technical and physical layered defenses, employee training on handling personal information, and establishing an incident response plan”.

Mandated Reporting and Exceptions

In the new year, entities required to provide notice to impacted individuals under the law in case of a breach will also be required to notify the Attorney General. The notification must include specific details including, but not limited to, the type of personal information impacted, the nature of the breach, the number of impacted individuals, the estimated monetary impact of the breach to the extent such can be determined, and any reasonable safeguards the entity employs. The notification to the Attorney General must occur no more than 60 days after notifying affected residents.

However, breaches affecting fewer than 500 residents, or fewer than 1,000 residents in the case of credit bureaus, are exempt from the requirement to notify the Attorney General.

In addition, an exception from individual notification is provided for entities that comply with notification requirements under the Oklahoma Hospital Cybersecurity Protection Act of 2023 or the Health Insurance Portability and Accountability Act of 1996 (HIPAA) if such entities provide the requisite notice to the Attorney General.

What Entities Should Do Now

  1. Inventory data.  Conduct an inventory to determine what personal information is collected given the newly covered data elements.
  2. Review and update policies and practices.  Reevaluate and update current information security policies and procedures to ensure proper reasonable safeguards are in place.  Moreover, to ensure that an entity’s policies and procedures remain reasonably designed, they should be periodically reviewed and updated.

If you have any questions about the revisions to Oklahoma’s Security Breach Notification Act or related issues, contact a Jackson Lewis attorney to discuss.

“Our cars know how fast you’re driving, where you’re going, how long you stay there. They know where we work, they know whether we stop for a drink on the way home, whether we worship on the weekends, and what we do on our lunch hours.” – Oregon Representative David Gomberg

The Oregon Legislature recently enacted House Bill 3875, amending the Oregon Consumer Privacy Act (OCPA) effective September 28, 2025, to broaden its scope to include motor vehicle manufacturers and their affiliates that control or process personal data from a consumer’s use of a vehicle or its components.

While this expansion is clear in its application to vehicle manufacturers, it raises important questions for automobile dealerships, particularly those “affiliated”—formally or informally—with manufacturers. Dealerships should consider whether they may now be subject to the full scope of Oregon’s privacy law. Of course, they may be subject directly to the OCPA in their own right.

The Amendment: HB 3875

HB 3875 modifies ORS 646A.572 to extend the OCPA’s privacy obligations to:

“A motor vehicle manufacturer or an affiliate of the motor vehicle manufacturer that controls or processes personal data obtained from a consumer’s use of a motor vehicle or a vehicle’s technologies or components.”

Who Counts as an “Affiliate”?

To determine whether a dealership is subject to these new obligations, one must examine the OCPA’s definition of affiliate:

“Affiliate” means a person that, directly or indirectly through one or more intermediaries, controls, is controlled by or is under common control with another person such that:

      (a) The person owns or has the power to vote more than 50 percent of the outstanding shares of any voting class of the other person’s securities;

      (b) The person has the power to elect or influence the election of a majority of the directors, members or managers of the other person;

      (c) The person has the power to direct the management of another person; or

      (d) The person is subject to another person’s exercise of the powers described in paragraph (a), (b) or (c) of this subsection.

This definition introduces some ambiguity for dealerships. Many dealerships operate as independent businesses, even if they sell only one manufacturer’s vehicles and display that brand prominently. While they may be contractually tied to a manufacturer, they may not meet the legal standard of being controlled by or under common control with that manufacturer as described in the definition.

However, certain dealership groups—particularly those owned or operated by manufacturers or holding companies—may clearly fall within the definition of “affiliate.”

Dealerships should evaluate their corporate structure and agreements with manufacturers to determine whether this definition might apply to them.

Why This Matters

Entities subject to the OCPA must comply with a range of privacy requirements, including:

  • Providing transparent privacy notices
  • Obtaining consumer consent for data collection and sharing under certain circumstances
  • Offering consumer rights such as access, correction, deletion, and data portability
  • Implementing reasonable data security measures

These obligations extend to any personal data collected through vehicle technologies, such as navigation systems, driver behavior analytics, location data, and mobile app integrations.

Federal Context: FTC Enforcement

Dealerships should also remain aware of federal obligations. Under the Gramm-Leach-Bliley Act (GLBA), auto dealers engaged in leasing or financing must follow privacy and safeguard rules enforced by the Federal Trade Commission (FTC).

The FTC has published detailed guidance for auto dealers on complying with these privacy and safeguards obligations.

What Dealerships Should Do Now

Even if a dealership is not legally an “affiliate” under the OCPA or subject to a similar state comprehensive privacy law,  the trend toward regulating vehicle-generated data suggests it’s time to proactively review data practices. Dealerships should:

  1. Conduct a data inventory to identify what personal data is collected, especially from connected vehicle systems.
  2. Update privacy notices and practices in accordance with state and federal law.
  3. Review contracts with manufacturers and vendors for data-sharing provisions and compliance obligations.
  4. Train staff on new privacy responsibilities and how to respond to consumer data requests.

California lawmakers have proposed new legislation to reshape the growing use of artificial intelligence (AI) in the workplace. While this bill aims to protect workers, employers have expressed concerns about how it might affect business efficiency and innovation.

What Does California’s Senate Bill 7 (SB 7) Propose?

SB 7, also known as the “No Robo Bosses Act,” introduces several key requirements and provisions restricting how employers use automated decision systems (ADS) powered by AI. These systems are used in making employment-related decisions, including hiring, promotions, evaluations, and terminations. The pending bill seeks to ensure that employers use these systems responsibly and that AI only assists in decision-making rather than replacing human judgment entirely.

The bill is significant for its privacy, transparency, and workplace safety implications, areas that are fundamental as technology becomes more integrated into our daily work lives.

Privacy and Transparency Protections

SB 7 includes measures to safeguard worker privacy and ensure that personal data is not misused or mishandled. The bill prohibits the use of ADS to infer or collect sensitive personal information, such as immigration status, religious or political beliefs, health data, sexual or gender orientation, or other statuses protected by law. These restrictions could significantly limit an employer’s ability to use ADS to streamline human resources administration, even if the ADS only assists but does not replace human decision-making. Notably, the California Consumer Privacy Act, which treats applicants and employees of covered businesses as consumers, permits the collection of such information.

Additionally, if the bill is enacted, employers and vendors will have to provide written notice to workers if an ADS is used to make employment-related decisions that affect them. The notice must provide a clear explanation of the data being collected and its intended use. Affected workers also must receive a notice after an employment decision is made with ADS. This focus on transparency aims to ensure that workers are aware of how their data is being used.

Workplace Safety

Beyond privacy, SB 7 also highlights workplace safety by prohibiting the use of ADS that could violate labor laws or occupational health and safety standards. Employers would need to make certain that ADS follow existing safety regulations, and that this technology does not compromise workplace health and safety. Additionally, ADS restrictions imposed by this pending bill could affect employers’ ability to proactively address or monitor potential safety risks with the use of AI.

Oversight & Enforcement

SB 7 prohibits employers from relying primarily on an ADS for significant employment-related decisions, such as hiring and discipline, and requires human involvement in the process. The bill grants workers the right to access and correct their data used by ADS, and they can appeal ADS employment-related decisions. A human reviewer must also evaluate the appeal. Employers cannot discriminate or retaliate against a worker for exercising their rights under this law.

The Labor Commissioner would be responsible for enforcing the bill, and workers may bring civil actions for alleged violations. Employers may face civil penalties for non-compliance.

What’s Next?

While SB 7 attempts to keep pace with the evolution of AI in the workplace, there will likely be ongoing debate about these proposed standards and which provisions will ultimately become law. Jackson Lewis will continue to monitor the status of SB 7.

If you have questions about California’s pending legislation and how it could affect your organization, contact a Jackson Lewis attorney to discuss.

A recent series of articles by the International Association of Privacy Professionals discusses a trend in privacy litigation focused on breach of contract and breach of warranty claims.  

Practical Takeaways

  • Courts are increasingly looking at website privacy policies, terms of use, privacy notices, and other statements from organizations and assessing breach of contract and warranty claims when individuals allege businesses failed to uphold their stated (or unstated) data protection promises (or obligations).
  • To avoid such claims, businesses should review their data privacy and security policies and public statements to ensure they accurately reflect their data protection practices, invest in robust security measures, and conduct regular audits to maintain compliance.

Privacy policies are no longer just formalities; they can become binding commitments. Courts are scrutinizing these communications to determine whether businesses are upholding their promises regarding data protection. Any discrepancies between stated policies and actual practices can lead to breach of contract claims. In some cases, similar obligations can be implied through behavior or other circumstances and create a contract.

There are several ways these types of claims arise. The following outlines the concepts that plaintiffs are asserting:

  • Breach of Express Contract: These claims arise when a plaintiff alleges a business failed to adhere to the specific terms outlined in its privacy policies. For example, a claim may arise when a company promises to “never” share user data with third parties but then does so.
  • Breach of Implied Contract: Even in the absence of explicit terms, businesses can face claims based on implied contracts. This occurs when there is an expectation of privacy and/or security based on the nature of the relationship between the business and its customers.
  • Breach of Express Warranty: Companies that make specific assurances about the security and confidentiality of user data can be held liable if they fail to meet these assurances.
  • Breach of Implied Warranty: These claims are based on the expectation that a company’s data protection measures will meet certain standards of quality and reliability.

How to avoid being a target:

  1. Ensure Accuracy in Privacy Policies, Notices, Terms: Even if a business takes the steps described below and others to strengthen its data privacy and security safeguards, those efforts still may be insufficient to support strong statements concerning such safeguards made in policies, notices, and terms. Accordingly, businesses should carefully review and scrutinize their privacy policies, notices, terms, and conditions for collecting, processing, and safeguarding personal information. This effort should involve the drafters of those communications working with IT, legal, marketing, and other departments to ensure the communications are clear, accurate, and reflective of their actual data protection practices.
  2. Assess Privacy and Security Expectations and Obligations. As noted above, breach of contract claims may not always arise from express contract terms. Businesses should be aware of circumstances that might suggest an agreement with customers concerning their personal information, and then work to address the contours of that promise.
  3. Strengthen Data Privacy and Security Protections. A business may be comfortable with its public privacy policies and notices and feel that it has satisfied any implied obligations, yet still face breach of contract or warranty claims. In that case, having a mature and documented data privacy and security program can go a long way toward strengthening the business’s defensible position. Such a program includes adopting comprehensive privacy and security practices and regularly updating them to address new threats. At a minimum, the program should comply with applicable regulatory obligations, as well as industry guidelines. The business should regularly review the program, its practices, and changes in its services, along with publicly stated policies, notices, and customer agreements, to ensure that data protection measures align with stated policies.

On March 10, 2025, California Attorney General Rob Bonta announced an investigative sweep targeting the location data industry, emphasizing compliance with the California Consumer Privacy Act (CCPA). This announcement follows the California legislature proposing a bill that, if passed, would impose restrictions on the collection and use of geolocation data.

Of course, concerns about geolocation tracking are not limited to California.

In California, the Attorney General’s investigation involved sending letters to advertising networks, mobile app providers, and data brokers that appear to the Attorney General to be in violation of the CCPA. These letters notify recipients of potential violations and request additional information regarding their business practices. The focus is on how businesses offer and effectuate consumers’ rights to stop the sale and sharing of personal information and to limit the use of sensitive personal information, including geolocation data.

To avoid enforcement actions, businesses in the location data industry must ensure compliance with the CCPA.

  1. Understand Consumer Rights: The CCPA grants California consumers several rights, including the right to know what personal information is being collected, the right to opt out of the sale or sharing of their personal information, and the right to delete their personal information. Because precise geolocation data is “sensitive personal information” under the CCPA, consumer rights also include the right to limit the use or disclosure of such information. Businesses must clearly communicate these rights to consumers (which includes employees).
  2. Implement Opt-Out/Limitation Mechanisms: As noted, businesses must provide consumers with the ability to opt out of the sale and sharing of their personal information, and to limit the use or disclosure of sensitive personal information. This includes implementing clear and accessible opt-out/limitation request mechanisms on websites and mobile apps (see the illustrative sketch following this list). Once a consumer opts out, businesses cannot sell or share their personal information unless they receive authorization to do so again.
  3. Transparency and Accountability: Businesses must be transparent about their data collection and disclosure practices. This includes providing detailed privacy policies that explain what data is collected, how it is used, and the categories of third parties to whom it is disclosed. Additionally, businesses should be prepared to respond to inquiries from the Attorney General’s office and provide documentation of their compliance efforts.
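For illustration only, the short sketch below shows one way a website might detect the Global Privacy Control (GPC) browser signal, which California regulators have treated as a valid request to opt out of the sale or sharing of personal information, and record the visitor as opted out. It is written in TypeScript and assumes a plain Node.js web server; the preference object and handler names are hypothetical and are not prescribed by the CCPA.

```typescript
// Minimal sketch (assumptions: Node.js/TypeScript, hypothetical preference object).
// Browsers with GPC enabled send the "Sec-GPC: 1" request header; a site honoring
// that signal would stop selling or sharing that visitor's personal information.
import { createServer, IncomingMessage, ServerResponse } from "node:http";

// Hypothetical per-visitor privacy preferences derived from the request.
interface PrivacyPreferences {
  saleOrSharingAllowed: boolean;
  limitSensitiveUse: boolean;
}

function resolvePreferences(req: IncomingMessage): PrivacyPreferences {
  // Node lowercases header names; the GPC header value is the string "1".
  const gpcEnabled = req.headers["sec-gpc"] === "1";
  return {
    // Treat the signal as an opt-out of sale/sharing for this visitor.
    saleOrSharingAllowed: !gpcEnabled,
    // A business may also choose to limit sensitive-data use on the same signal.
    limitSensitiveUse: gpcEnabled,
  };
}

const server = createServer((req: IncomingMessage, res: ServerResponse) => {
  const prefs = resolvePreferences(req);
  // Downstream code (ad tags, analytics, data-broker feeds) would consult these
  // flags before selling or sharing the visitor's personal information.
  res.setHeader("Content-Type", "application/json");
  res.end(JSON.stringify(prefs));
});

server.listen(8080);
```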

If you have questions about the current California investigation into geolocation or need assistance in ensuring compliance with the CCPA, contact a Jackson Lewis attorney to discuss.

In 2024, Colorado passed the first comprehensive state-level law in the U.S. regulating the use of artificial intelligence, the Artificial Intelligence Act (the Act). It imposed strict requirements on developers and users of “high-risk” AI systems, particularly in sectors like employment, housing, finance, and healthcare. The Act drew criticism for its complexity, breadth, and potential to stifle innovation.

In early 2025, lawmakers introduced Senate Bill (SB) 25-318 as a response to growing concerns from the tech industry, employers, and even Governor Jared Polis, who reluctantly signed the Act into law last year.

SB 25-318 aimed to soften and clarify some of the more burdensome aspects of the original legislation before its compliance deadline of February 1, 2026.

Amendments proposed under SB 25-318 included:

  • An exception to the definition of “developer” if the person offers an AI system with open model weights and meets specified conditions.
  • Exemptions for specified technologies.
  • Elimination of the duty of a developer or deployer to use reasonable care to protect consumers from known or reasonably foreseeable risks of algorithmic discrimination and the requirement to notify the state attorney general of such risk.
  • An exemption from specified disclosure requirements for developers if they meet certain financial and operational criteria.

Despite its intention to strike a balance between innovation and regulation, SB 25-318 was voted down 5-2 by the Senate Business, Labor, and Technology Committee on May 5, 2025.

With SB 25-318 dead, the original Act remains intact, and the next step is for the Colorado Attorney General to issue rules and/or guidance. As it now stands, businesses and developers operating in Colorado must prepare for full compliance by early 2026 unless this date is otherwise extended.

If you have questions about compliance with Colorado’s Artificial Intelligence Act or related issues, contact a Jackson Lewis attorney to discuss.

Online retailer Harriet Carter Gifts recently obtained summary judgment from the district court in a class action under Pennsylvania wiretap law. At the heart of this case is the interpretation and application of the Pennsylvania Wiretapping and Electronic Surveillance Control Act of 1978 (WESCA), a statute designed to regulate the interception of electronic communications. The court’s primary task was to determine whether the actions of Harriet Carter Gifts and NaviStone constituted an unlawful interception under this law.

In 2021, the district court sided with the defendants, granting summary judgment because NaviStone was a direct party to the communications, and thus, no interception occurred under WESCA. However, the Third Circuit Court of Appeals overturned this decision. The appellate court clarified that there is no broad direct-party exception to civil liability under WESCA. Consequently, the case was remanded to determine “whether there is a genuine issue of material fact about where the interception occurred.”

On remand, the district court examined whether the plaintiff, Popa, could be deemed to have consented to the interception of her data by NaviStone through the privacy policy posted on Harriet Carter’s website. The court focused on whether the privacy policy was sufficiently conspicuous to provide constructive notice to Popa.

The enforceability of browsewrap agreements, which are terms and conditions posted on a website without requiring explicit user consent, was another critical aspect of the case. The court found that Harriet Carter’s privacy policy was reasonably conspicuous and aligned with industry standards. The court noted that the privacy policy was linked in the footer of every page on the Harriet Carter website, labeled “Privacy Statement,” and was in white font against a blue background. This placement was consistent with common industry practices in 2018 when the violation was alleged, which typically involved placing privacy policies in the footer of websites.

This led the court to conclude that Popa had constructive notice of the terms, reinforcing the notion of implicit consent. Notably, the court found implicit consent without any evidence that Popa had actual knowledge of the terms of the privacy statement.  Rather, the court found a reasonably prudent person would be on notice of the privacy statement’s terms. 

Based on these findings, the court granted summary judgment in favor of the defendants. The court determined that Popa’s WESCA claim failed because she had implicitly consented to the interception by NaviStone, as outlined in Harriet Carter’s privacy statement. 

The case of Popa v. Harriet Carter Gifts, Inc. and NaviStone, Inc. emphasizes the necessity for clear and accessible privacy policies in the digital era. It also brings attention to the complex legal issues related to user consent and the interception of electronic communications. For inquiries regarding compliance with consumer consent requirements for websites, please contact a Jackson Lewis attorney to discuss further.

In late March 2025, the Florida Bar Board of Governors unanimously endorsed the recommendation of its Special Committee on Cybersecurity and Privacy Law that law firms should adopt written incident response plans (IRPs) to better prepare for and respond to data security incidents. The recommendation reflects a growing recognition across professional service industries—particularly law firms—of the serious risks posed by cyber threats and the need for structured, proactive responses.

The message is simple: law firms must be prepared.

As most practitioners will observe, it is not a matter of if an organization will experience a data breach, but when. Developing and implementing an IRP can be difficult because the nature of legal practice poses unique challenges, even for smaller firms. At the same time, law firms are stewards of vast amounts of highly sensitive client and employee data, often spanning multiple industries, jurisdictions, and confidentiality regimes. Those data sets make law firms attractive targets for threat actors, especially those seeking access to intellectual property, litigation strategies, and regulatory or financial information, not to mention sensitive personal information.

What Makes Law Firms Different?

Unlike organizations in several other industries, law firms often lack centralized compliance infrastructures or in-house technical expertise. Client confidentiality obligations and the attorney-client privilege can complicate both the detection and disclosure of incidents. In some cases, firms may confuse confidentiality with security, when both are needed.

In addition, unlike most other professional service providers, law firms grapple with a set of comprehensive rules of professional responsibility that increasingly delve into data privacy and cybersecurity issues. Of course, those rules sit on top of generally applicable business regulation that law firms also face. See, for example, our recent discussion about the Florida Information Protection Act (FIPA), which mandates that certain entities, including law firms, implement reasonable measures to protect electronic data containing personal information.

When engaging a new client, a simple engagement letter may no longer be sufficient, especially for law firms representing certain businesses, particularly those that are heavily regulated. Consider law firms that defend medical malpractice claims. Their clients are most likely healthcare providers covered by the privacy and security regulations under the Health Insurance Portability and Accountability Act (HIPAA). That makes these firms “business associates” to the extent those services involve access to “protected health information.” Just like their healthcare provider clients, business associate law firms are required to maintain an incident response plan. 45 CFR 164.308(a)(6). So, even before Recommendation 25-1, many law firms may have already been obligated to maintain an IRP, at least with respect to certain information collected from or on behalf of certain clients.

Given these realities for law firms, the Florida Bar’s recommendation is both timely and necessary, even if not unprecedented. Notably, in 2018, the ABA issued Formal Opinion 483, which made a similar recommendation. Law firms considering an IRP should consult Formal Opinion 483.

What Should a Law Firm’s Incident Response Plan Include?

A comprehensive and tailored IRP should be risk-based and scalable to firm size, practice areas, and existing infrastructure. Here are some components all firms should consider including in their IRP:

  1. Governance and Roles
    Define the internal response team and assign roles, including legal, IT, communications, HR, and leadership. Identify outside partners, such as breach counsel, forensics, and public relations firms.
  2. Data Mapping and Risk Assessment
    Map the data your firm collects, stores, and shares. Understand where sensitive client and employee information resides and how it is secured. A risk assessment will help prioritize which systems and data are most critical.
  3. Incident Detection and Reporting
    Establish processes for identifying, reporting, and escalating suspected incidents. Time is critical when responding to ransomware, business email compromise, or other attacks. Remember to have a plan to communicate outside of the firm’s existing systems, which may not be operable.
  4. Investigation and Containment
    Outline steps to contain and investigate an incident, including coordination with law enforcement, insurance carriers, forensic investigators, and legal advisors.
  5. Notification and Legal Obligations
    Address client communications, breach notification laws, ethical duties, and contractual terms that may require specific responses.
  6. Post-Incident Review and Testing
    After resolving an incident, assess what went well and what needs improvement. Regular tabletop exercises and plan updates are essential.


With cyber threats evolving and legal obligations expanding, law firms must treat incident response planning as an ethical, professional, and business imperative. The Florida Bar’s recommendation should serve as a wake-up call. By building a strong IRP, law firms can better protect client confidences, meet regulatory requirements, and preserve their professional reputation.

On March 24, 2025, Virginia’s Governor vetoed House Bill (HB) 2094, known as the High-Risk Artificial Intelligence Developer and Deployer Act. This bill aimed to establish a regulatory framework for businesses developing or using “high-risk” AI systems.

The Governor’s veto message emphasized concerns that HB 2094’s stringent requirements would stifle innovation and economic growth, particularly for startups and small businesses. The bill would have imposed nearly $30 million in compliance costs on AI developers, a burden that could deter new businesses from investing in Virginia. The Governor argued that the bill’s rigid framework failed to account for the rapidly evolving nature of the AI industry and placed an onerous burden on smaller firms lacking large legal compliance departments.

The veto of HB 2094 in Virginia reflects a broader debate in AI legislation across the United States. As AI technology continues to advance, both federal and state governments are grappling with how to regulate its use effectively.

At the federal level, AI legislation has been marked by contrasting approaches between administrations. Former President Biden’s Executive Orders focused on ethical AI use and risk management, but many of these efforts were revoked by President Trump this year. Trump’s new Executive Order, titled “Removing Barriers to American Leadership in Artificial Intelligence,” aims to foster AI innovation by reducing regulatory constraints.

State governments are increasingly taking the lead in AI regulation. States like Colorado, Illinois, and California have introduced comprehensive AI governance laws. The Colorado AI Act of 2024, for example, uses a risk-based approach to regulate high-risk AI systems, emphasizing transparency and risk mitigation. While changes to the Colorado law are expected before its 2026 effective date, it may emerge as a prototype for other states to follow.

Takeaways for Business Owners

  1. Stay Informed: Keep abreast of both federal and state-level AI legislation. Understanding the regulatory landscape will help businesses anticipate and adapt to new requirements.
  2. Proactive Compliance: Develop robust AI governance frameworks to ensure compliance with existing and future regulations. This includes conducting risk assessments, implementing transparency measures, and maintaining proper documentation.
  3. Innovate Responsibly: While fostering innovation is crucial, businesses must also prioritize ethical AI practices. This includes preventing algorithmic discrimination and ensuring the responsible use of AI in decision-making processes.

If you have questions about compliance with AI regulation, contact a member of our Artificial Intelligence Group or the Jackson Lewis attorney with whom you regularly work.

On Friday, the U.S. Department of Health and Human Services (HHS), Office for Civil Rights (OCR) announced the fifth enforcement action under its Risk Analysis Initiative. In this case, OCR reached a settlement with Health Fitness Corporation (Health Fitness), a wellness vendor providing services to employer-sponsored group health plans.

This announcement is interesting for several reasons. It furthers the OCR’s Risk Analysis Initiative. The enforcement action is a reminder to business associates about HIPAA compliance. The development also points to a significant issue under ERISA for plan fiduciaries and service providers to their plans.

The OCR Risk Analysis Initiative

Anyone who takes a look at prior OCR enforcement actions will notice several trends. One of those trends relates to enforcement actions following a data breach. In those cases, the OCR frequently alleges the target of the action failed to satisfy the risk analysis standard under the Security Rule. This standard is fundamental – it involves assessing the threats and vulnerabilities to electronic protected health information (ePHI), a process that helps to shape the covered entity’s or business associate’s approach to the other standards and goes beyond a simple gap analysis.

“Conducting an accurate and thorough risk analysis is not only required but is also the first step to prevent or mitigate breaches of electronic protected health information,” said OCR Acting Director Anthony Archeval.  “Effective cybersecurity includes knowing who has access to electronic health information and ensuring that it is secure.”

For those wondering how committed the OCR is to its enforcement initiatives, you need not look further than its Right to Access Initiative. On March 6, 2025, the agency announced its 53rd enforcement action. According to that announcement, it involved a $200,000 civil monetary penalty imposed against a public academic health center and research university for violating an individual’s right to timely access her medical records through a personal representative.

The DOL Cybersecurity Rule

Businesses that sponsor a group health plan or other ERISA employee benefit plans might want to review the OCR’s announcement and resolution agreement concerning Health Fitness a little more carefully. In 2024, the DOL’s Employee Benefits Security Administration (EBSA) issued Compliance Assistance Release No. 2024-01. That release makes clear that the fiduciary obligation to assess the cybersecurity of plan service providers applies to all ERISA-covered employee benefit plans, including wellness programs for group health plans.

OCR commenced its investigation of Health Fitness after receiving four reports from Health Fitness, over a three-month period (October 15, 2018, to January 25, 2019), of breaches of PHI. According to the OCR, “Health Fitness reported that beginning approximately in August 2015, ePHI became discoverable on the internet and was exposed to automated search devices (web crawlers) resulting from a software misconfiguration on the server housing the ePHI.” Despite these breaches, according to the OCR, Health Fitness failed to conduct an accurate and thorough risk analysis until January 19, 2024.

Health Fitness agreed to implement a corrective action plan, which OCR will monitor for two years, and paid $227,816 to OCR. For ERISA plan fiduciaries, an important question is what they need to do to assess the cybersecurity of plan service providers like Health Fitness during the procurement process and beyond.

We provide some thoughts in our earlier article and want to emphasize that plan fiduciaries need to be involved in the process. Cybersecurity is often a risk left to the IT department. However, leaving it there may leave even the most ardent IT professional ill-equipped or insufficiently informed about the threats and vulnerabilities of the particular service provider. When it comes to ERISA plans, this means properly assessing the threats and vulnerabilities as they relate to the aspects of plan administration being handled by the service provider.

Third-party plan service providers and plan fiduciaries should begin taking reasonable and prudent steps to implement safeguards that will adequately protect plan data. EBSA’s guidance should help the responsible parties get there, along with the plan fiduciaries’ and plan sponsors’ trusted counsel and other advisors.