Maryland’s governor recently signed the Maryland Online Data Privacy Act of 2024 (MODPA), making Maryland one of six states—along with Kentucky, Nebraska, New Hampshire, New Jersey, and Rhode Island—to pass a comprehensive privacy law this year.  Overall, 19 states (and counting) now have such laws on their books.  

Maryland’s law takes effect October 1, 2025.

To whom does the law apply?

MODPA applies to organizations that conduct business in Maryland, or provide products or services that are targeted to its residents, and that, during the preceding calendar year, did one of the following:

  • Controlled or processed the personal data of at least 35,000 state residents, excluding data or processing solely for the purposes of completing payment transactions, or
  • Controlled or processed the personal data of at least 10,000 state residents and derived more than 20 percent of their gross revenue from the sale of personal data.

MODPA excludes from its application financial institutions, along with data subject to other privacy frameworks, including the Health Insurance Portability and Accountability Act (HIPAA) and the Family Educational Rights and Privacy Act (FERPA).  Notably, however, these exemptions are data-level: MODPA does not exempt HIPAA-covered entities, institutions of higher education, or nonprofits themselves.  

Who is protected by the law?

“Consumer” means an individual who is a resident of the State of Maryland.  The definition does not include an individual acting in a commercial or employment context.

What data is protected by the law?

MODPA protects “personal data,” which it defines as any information that is linked or reasonably could be linked to an identified or identifiable individual.  The law excludes de-identified data and publicly available information.

What are the rights of consumers?

MODPA grants consumers the rights to:

  • Request confirmation of whether a controller is processing their personal data;
  • Request access to that data;
  • Request to correct it;
  • Request its deletion;
  • Obtain a list of the categories of third parties to which the controller has disclosed their data;
  • Opt out of the sale of their personal data, or use of that data for targeted advertising or profiling; and
  • Be free from discrimination for exercising their MODPA rights.

What obligations do controllers have?

MODPA requires that controllers:

  • Provide consumers with a reasonably accessible, clear, and meaningful privacy notice that discloses, among other things:
    • the categories of personal data processed by the controller, including sensitive data;
    • the controller’s purpose for processing personal data;
    • how a consumer may exercise rights under MODPA, including how a consumer may appeal a controller’s decision regarding the consumer’s request;
    • the categories of third parties with which the controller shares personal data;
    • the categories of personal data, including sensitive data, that the controller shares with third parties;
    • an email address or other online mechanism that a consumer may use to contact the controller; and
    • if applicable, a clear, conspicuous, and prominently displayed notice that (a) the controller sells personal data, or discloses it for targeted advertising or profiling, and (b) the consumer has the right to opt out of the disclosure of its data for those purposes.
  • Limit their collection of personal data to what is reasonably necessary and proportionate to provide or maintain a specific product or service requested by the consumer.
  • Conduct and document a data protection assessment for each processing activity that presents a heightened risk of harm to a consumer, including an assessment of each algorithm that is used.

Controllers are also prohibited from selling “sensitive data,” meaning data that reveals a consumer’s racial or ethnic origin, religious beliefs, health data, sex life, sexual orientation, status as transgender or nonbinary, national origin, or citizenship.

In addition to the prohibition on selling consumer health data, MODPA prohibits providing employees or contractors with access to such data unless the employee or contractor is subject to a contractual or statutory duty of confidentiality, or, in the case of an employee, confidentiality is required as a condition of employment.

How is the law enforced?

MODPA will be enforced by the state’s attorney general.  Though it does not establish a private right of action, it permits consumers to pursue remedies under other laws.

***

If you have questions about MODPA or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

Virtually all organizations have an obligation to safeguard their personal data against unauthorized access or use, and, in some instances, to notify affected individuals in the event such access or use occurs.  Those obligations are, in some instances, relatively nebulous, and organizations—for better or worse—have flexibility to determine what pre-incident safeguards and post-incident responsive actions are “reasonable” under the circumstances. 

The SEC, in its recent amendments to Regulation S-P (the Amendments), takes a different approach.  The Amendments impose detailed and specific obligations on covered institutions—including broker-dealers, investment companies, registered investment advisers, and transfer agents—to (1) develop and maintain written incident response programs and (2) provide notification to affected individuals in the event their sensitive customer information is subject to unauthorized access or use (a Data Breach).

Incident Response Program

The Amendments require covered institutions to develop and maintain written incident response programs.  The function of these programs is to enable covered institutions to better detect and respond to Data Breaches, including by facilitating their:

  • assessment of the nature and scope of these incidents, including identification of the internal systems containing customer information and the types of customer information that may have been accessed or used without authorization.  The Amendments indicate that covered institutions, when assessing an incident, should consider the type and extent of the unauthorized access, the impact on operations, and whether information has been exfiltrated or is no longer accessible;
  • containment and control of the incident to prevent further unauthorized access to or use of customer information.  The Amendments acknowledge that the appropriate steps for containing and controlling an incident will vary based on its nature, but identify the following as potential key action items: isolating affected systems, enhancing system monitoring, identifying additional compromised systems, forcing password resets, and changing or disabling default user accounts; and
  • notification to individuals whose “sensitive customer information” (defined below) was, or is reasonably likely to have been, accessed or used without authorization.

Notably, while the foregoing incident response program requirements apply to all consumer “nonpublic personal information”—a broad category encompassing all personally identifiable financial information a financial institution collects about an individual in connection with providing a financial product or service—the notification obligations discussed below are limited to incidents impacting “sensitive customer information.”

Notification to Affected Individuals

Covered institutions must provide notice to each affected individual whose sensitive customer information was, or is reasonably likely to have been, subject to a Data Breach.  “Sensitive customer information” includes:

  • information uniquely identified with an individual, such that it can reasonably be used to authenticate the individual’s identity;
  • government-issued identification numbers, including a social security number, driver’s license number, alien registration number, passport number, or employer or taxpayer identification number;
  • a biometric record;
  • a unique electronic identification number, address, or routing code;
  • telecommunication identifying information or access device; or
  • information identifying an individual or an individual’s account, including an account number, name, or online username, in combination with other authenticating information that could be used to gain access to an individual’s account.

In the event of a Data Breach, the Amendments require covered institutions to provide clear and conspicuous notice “as soon as practicable,” but not later than 30 days after their discovery of the breach.  Notice to affected individuals must include the following:

  • a general description of the incident and type of sensitive customer information affected;
  • the date (or estimated date/date range) of the incident;
  • contact information notice recipients can utilize to obtain more information about the incident; and
  • steps affected individuals can take to protect their information, including how they can obtain free credit reports, place fraud alerts on their accounts, and review their account statements for suspicious activity.

Under the Amendments, unauthorized access to or use of sensitive customer information does not always trigger the obligation to notify.  Notice is not required if, after a reasonable investigation of the relevant facts and circumstances, the covered institution determines that the sensitive customer information in question has not been, and is not reasonably likely to be, used in a manner resulting in substantial harm or inconvenience (e.g., because it was protected by encryption).  The Amendments indicate that, if a covered institution reasonably determines that a specific individual’s sensitive customer information was not accessed or used without authorization, it does not need to notify that individual.  However, if the covered institution is unable to identify which specific individuals’ sensitive customer information has been accessed or used, it must notify all individuals whose information resided on the impacted information system.

Implementation

The Amendments will take effect in early August 2024, but covered entities—depending on their size—will have an 18- or 24-month grace period to come into compliance.  Larger entities, which are defined below, will need to come into compliance by December 2025, while smaller entities will have until June 2026.

The following entities qualify as larger entities:

  • Investment companies (together with other investment companies in the same group of related investment companies): net assets of $1 billion or more as of the end of the most recent fiscal year.
  • Registered investment advisers: $1.5 billion or more in assets under management.
  • Broker-dealers: all broker-dealers that are not small entities under the Securities Exchange Act for purposes of the Regulatory Flexibility Act.
  • Transfer agents: all transfer agents that are not small entities under the Securities Exchange Act for purposes of the Regulatory Flexibility Act.

Takeaways

Though the grace periods will likely lull some entities into near-term complacency—believing they have plenty of time to get their houses in order—prudent entities will place compliance with the Amendments high on their task lists. 

For entities that haven’t already made a significant investment in their incident response programs, development of the robust program the Amendments require will be a heavy lift.  Compliance with the assessment component, for instance, may require entities to conduct extensive data mapping to better understand what data they have, where it’s stored, how it’s safeguarded, and how long it’s retained. 

They may also need to take a close look at their current controls to detect and rapidly investigate and respond to potential Data Breaches, including those that enable the isolation of affected systems, the identification and eradication of ongoing malicious activity, and the restoration of business operations, including potential data recovery from backups. 

Covered entities will also need to prepare to analyze their notification obligations and timely provide requisite notices. 

To many, the above requirements will sound familiar, as they overlap to a degree with obligations imposed by state reasonable safeguard and breach notification laws.  The Amendments’ incident response plan prescriptions, however, are more detailed and onerous than the requirements imposed by most state laws, and their definition of “sensitive customer information” is broader than the definition of “personally identifiable information” (or the comparable term) in most states.  Accordingly, even entities that have mature incident response programs in place would benefit from giving those programs a fresh look to ensure they meet the Amendments’ lofty requirements. 

Jackson Lewis’ Financial Services and Privacy, Data, and Cybersecurity groups will continue to track this development.  Please contact a Jackson Lewis attorney with any questions.

On April 17, 2024, Nebraska’s governor signed Legislative Bill 1074, which establishes a consumer data privacy law for the state.

Nebraska’s law takes effect January 1, 2025.

To whom does the law apply?

The law applies to businesses that:

  • Conduct business in Nebraska or produce a product or service consumed by residents of Nebraska.
  • Process or sell personal data of residents of Nebraska.
  • Are not a small business as defined under the federal Small Business Act.

Note that, unlike the comprehensive privacy laws in most other states, Nebraska’s law does not condition the application of the law on certain thresholds, such as the number of consumers from whom the entity collects personal information.

The statute also provides a combination of exemptions based on entity and type of data. Specifically, the statute excludes certain entities such as financial institutions subject to the Gramm-Leach-Bliley Act (GLBA), institutions of higher education, and entities that are covered entities and business associates covered by the Health Insurance Portability and Accountability Act (HIPAA). Examples of the types of personal information that are excluded from the law include protected health information covered by HIPAA and personal information regulated by the Fair Credit Reporting Act.

Who is protected by the law?

“Consumer” means an individual who is a resident of the State of Nebraska acting only in an individual or household context. The definition does not include an individual acting in a commercial or employment context.

What data is protected by the law?

The law protects “personal data,” defined as any information that is linked or reasonably linkable to an identified or identifiable individual. The law excludes de-identified data and publicly available information, as well as personal data processed in a commercial or employment context.

What are the rights of consumers?

Under the law, consumers have the following rights:

  • To confirm whether a controller is processing their personal data.
  • To access personal data processed by a controller.
  • To correct inaccuracies in their personal data.
  • To delete personal data provided by or obtained about the consumer.
  • To obtain a copy of their personal data previously provided to the controller.
  • To opt out of the processing of personal data for the purposes of targeted advertising, the sale of their personal data, or profiling in furtherance of a decision that produces a legal or similarly significant effect concerning the consumer.

Similar to the frameworks established in other states to process requests from consumers concerning these rights, controllers are required to respond within certain timeframes (generally 45 days) and provide a mechanism for appealing the denial of a right.

What obligations do controllers have?

In addition to responding to requests from consumers seeking to exercise their rights, the law also requires that controllers provide consumers with a reasonably accessible and clear privacy notice that includes:

  • The categories of personal data processed by the controller.
  • The purpose for processing the personal data.
  • Information on how consumers may exercise their rights and appeal a controller’s decisions.
  • The categories of data it shares with third parties and a description of at least two methods through which the consumer may submit a request to exercise a consumer right.
  • A description of its sale of personal data to third parties and its processing of personal data for targeted advertising, including how a consumer may opt out of such processing.

Existing Nebraska law (Revised Statute 87-808) requires certain individuals and commercial entities in Nebraska to:

implement and maintain reasonable security procedures and practices that are appropriate to the nature and sensitivity of the personal information owned, licensed, or maintained and the nature and size of, and the resources available to, the business and its operations, including safeguards that protect the personal information when the individual or commercial entity disposes of the personal information.

The state’s comprehensive privacy law includes a similar obligation to maintain reasonable administrative, technical, and physical data security practices that are appropriate to the volume and nature of the personal data at issue. Additionally, the comprehensive privacy law provides that, in general, controllers may not:

Process personal data for a purpose that is neither reasonably necessary to nor compatible with the disclosed purpose for which the personal data is processed, as disclosed to the consumer unless the controller obtains the consumer’s consent [emphasis added].

This and other language in the statute may raise data minimization obligations similar to those recently addressed by the California Privacy Protection Agency.

Additionally, controllers must enter into written agreements with processors that process personal information on behalf of the controller. Examples of required provisions in these agreements include:

  • Instructions for the processing of personal information;
  • A requirement that any person at the processor responsible for processing personal information be subject to a duty of confidentiality;
  • A requirement that the processor cooperate with the controller’s data protection assessments, or obtain its own assessments and provide a report of the assessment to the controller on request; and
  • A requirement that, at the controller’s direction, the processor delete or return personal data at the termination of the agreement, unless retention is required by law.

How is the law enforced?

The State Attorney General has exclusive enforcement authority, and there is no private right of action.

If you have questions about Nebraska’s privacy law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

On June 11, 2024, the Consumer Financial Protection Bureau (CFPB) published a Notice of Proposed Rulemaking (NPRM) to amend Regulation V, which implements the Fair Credit Reporting Act (FCRA), to limit the inclusion of medical bills in consumer financial reports. This amendment, while providing significant benefits to Americans with substantial medical debt, may also alter and reduce risk for employers who lawfully consider credit information as part of the pre-employment process.  

The consideration of medical debt information in making employment decisions has long been a concern of workplace regulatory agencies. The Equal Employment Opportunity Commission (EEOC), along with the Federal Trade Commission (FTC), released guidance to U.S. employers in 2014 on criminal and financial background checks. This guidance emphasizes how credit reports and criminal histories may influence employment decisions. Background checks often reveal an applicant’s race, ethnicity, gender, financial record, criminal history, genetic information, or disability. Because of the myriad federal, state, and local laws and regulations, employers must be mindful of any “disparate impact” the practice of conducting background checks may impose on applicants if such information were to influence an adverse employment decision, such as job rejection.  

Employers must also be aware of the risk of potential disparate treatment claims, i.e., intentional discrimination, arising out of information learned during the background check process.  Relevant to accessing medical debt information, importantly, the EEOC reminds employers not to try to obtain genetic information or family medical history, as those inquiries violate the Genetic Information Nondiscrimination Act (GINA). The 2014 guidance also encourages employers to “[b]e prepared to make exceptions for problems revealed during a background check that were caused by a disability.”  

The FTC, in that same 2014 guidance, reminds employers that they must provide notice (with specific reasons as to the rejection) and a copy of “A Summary of Your Rights Under the Fair Credit Reporting Act” before taking adverse action based on information revealed in a credit report. The CFPB’s proposed regulation therefore can reduce the risk of an employer having knowledge of potentially protected information. 

Until recently, medical debt has had damaging effects on millions of working-age Americans. A study conducted by the CFPB showed that Black and Latino Americans aged 30-44, as well as Americans living in southern states, are most likely to have medical debt reported on their credit history.  

The CFPB’s newly proposed amendment to Regulation V, if adopted in its entirety, will alter access to medical debt information in consumer financial reports. The proposal includes three major amendments to Regulation V: (1) a definition of medical debt information; (2) removal of the financial information exception; and (3) restrictions on the reporting of medical debt by credit reporting agencies for use in eligibility determinations. That said, credit reports will still include medical debts that are in default.  

What impact does this potential amendment have on employers? Considering that government guidance from the EEOC and FTC has been in place for over ten years, prudent employers are already minimizing their exposure to potential claims by considering mitigating factors relating to medical debts, or by not considering that factor at all. As such, the underlying information in medical bills that reveals genetic information, family medical history, or a disability should be treated as confidential and should not be considered when evaluating the qualifications of a job applicant. If the CFPB’s amendments are implemented, employers and job applicants will benefit alike: employers will ensure they are making decisions based on what is job related and consistent with business necessity, irrespective of possible protected status, while applicants will no longer have to explain what might fall under a protected category when their credit has been impacted by significant medical debt. Medical payments in default can still be considered; however, the prudent employer can consider mitigating circumstances without delving into the underlying medical history. 

Special thanks to Giuseppina Mammoliti for her assistance with this article. 

With the Texas Data Privacy and Security Act (TDPSA) on the verge of taking effect on July 1, 2024, the State’s Attorney General, Ken Paxton, recently launched an initiative for “aggressive enforcement of Texas privacy laws.”  As part of the initiative, Paxton has established a team that will focus on the enforcement of Texas’ privacy protection laws, including the TDPSA, along with federal laws like the Children’s Online Privacy Protection Act (COPPA). 

Unlike most of the 15-plus states with comprehensive privacy laws, which exclude from their scope organizations that do not meet significant data-volume thresholds (e.g., processing data related to at least 100,000 state residents), the TDPSA, with limited exceptions, applies to any organization that conducts business in the state of Texas or produces a product or service consumed by Texas residents. In contrast to the California Consumer Privacy Act (CCPA), the TDPSA excludes human resources and business-to-business data. But aside from this exclusion, if an organization processes the personal data of consumers residing in Texas, there is a good chance it will be in scope.

Organizations that have programs in place to comply with the CCPA will have a head start toward compliance with the TDPSA.  That said, there are aspects of the TDPSA that differ from or go beyond the CCPA.  For instance, the TDPSA requires:

  • the inclusion of specific privacy policy disclosures related to the sale of biometric or sensitive personal data;
  • the collection of consent before processing personal data for previously undisclosed purposes or processing sensitive personal data;
  • data protection assessments in connection with processing sensitive personal data, selling personal data, or using it for targeted advertising;
  • the inclusion of specific provisions in vendor agreements; and
  • a mechanism for consumers to appeal the denial of their requests to exercise their TDPSA rights.   

For assistance bringing your organization into compliance with the TDPSA, please contact a member of our Privacy, Data, and Cybersecurity group.

When Colorado enacted the Colorado Privacy Act (CPA), it included “biometric data that may be processed for the purpose of uniquely identifying an individual.” However, the CPA as originally drafted did not cover the personal data of individuals acting in a commercial or employment context. Last week, Colorado amended the CPA to broaden the protections for biometric data when Gov. Jared Polis signed HB-1130 into law.

Application of the CPA Biometric Amendment. Importantly, HB-1130 alters the scope of the CPA’s application. Recall that under the CPA, a controller is subject to the CPA if it:

(i) determines the purposes and means of processing personal data, (ii) conducts business in Colorado or produces or delivers commercial products or services intentionally targeted to residents of the state, and (iii) either:  (a) controls or processes the personal data of more than 100,000 Colorado residents per year or (b) derives revenue from selling the personal data of more than 25,000 Colorado residents.

HB-1130 adds that a controller that does not meet the thresholds above is nonetheless subject to the CPA, but solely to the extent that it controls or processes any amount of biometric identifiers or biometric data.

Key Definitions. The amendment added language expressly applicable to employers, including defining employees to include not only individuals employed on a full- or part-time basis, but also individuals who are “on-call” or hired as a “contractor, subcontractor, intern, or fellow.” The amendment also adds definitions for “biometric data” and “biometric identifier”:

“Biometric data” means one or more biometric identifiers that are used or intended to be used, singly or in combination with each other or with other personal data, for identification purposes. “Biometric data” does not include the following unless the biometric data is used for identification purposes: (i) a digital or physical photograph; (ii) an audio or voice recording; or (iii) any data generated from a digital or physical photograph or an audio or video recording.

“Biometric identifier” means data generated by the technological processing, measurement, or analysis of a consumer’s biological, physical, or behavioral characteristics, which data can be processed for the purpose of uniquely identifying an individual. “Biometric identifier” includes: (a) a fingerprint; (b) a voiceprint; (c) a scan or record of an eye retina or iris; (d) a facial map, facial geometry, or facial template; or (e) other unique biological, physical, or behavioral patterns or characteristics.

While there are some similarities in these definitions to the corresponding definitions in the well-known Illinois Biometric Information Privacy Act (BIPA), there are some significant differences. One is that a biometric identifier under the BIPA is defined as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.” The Illinois law does not make reference to “other unique biological, physical, or behavioral patterns or characteristics.” Nor is there a private right of action for violations of the CPA amendment, as there is under the BIPA.

Requirements. HB-1130 establishes several requirements for controllers that control or process one or more biometric identifiers. These requirements include:

  • Obtaining consent from the consumer (including the employee) before collecting the consumer’s biometric data.
  • Maintaining a written policy that:
    • Establishes a retention schedule for biometric identifiers and biometric information;
    • Includes a process for responding to a data security incident that could compromise the security of biometric identifiers or biometric information, including the process for notifying consumers under the state’s existing data breach notification law; and
    • Establishes guidelines addressing the deletion of biometric identifiers within certain time frames.
  • Subject to certain exceptions, controllers must make the written policy available to the public. One exception is for a policy applying only to current employees of the controller.
  • Providing a reasonably accessible, clear, and meaningful privacy notice satisfying specific content requirements, including the purposes for processing.
  • Satisfying certain rights the consumer may have with respect to their biometric data, including the right to access.

HB-1130 also prohibits controllers from certain activities concerning biometric identifiers such as:

  • Selling, leasing, or trading such information;
  • Disclosing biometric identifiers, subject to limited exceptions, including consent and compliance with federal or state law; and
  • Refusing to provide a good or service to a consumer based on the consumer’s refusal to consent to the controller’s collection, use, or disclosure of a biometric identifier, unless the biometric identifier is necessary to provide the good or service.

Controllers and processors also must use a reasonable standard of care when storing, transmitting, and protecting biometric identifiers from disclosure.

Employment provisions. HB-1130 includes certain specific provisions for employers. While the law provides that employers may require current or prospective employees to allow the employer to collect and process their biometric identifiers, they may do so only to

  • Permit access to secure physical locations and secure electronic hardware and software applications (but not obtain consent to retain such data for current employee location tracking or tracking time using a hardware or software application),
  • Record the commencement and conclusion of the employee’s full workday, including meal breaks and rest breaks in excess of 30 minutes,
  • Improve or monitor workplace safety or security or ensure the safety or security of employees,
  • Improve or monitor the safety or security of the public in the event of an emergency or crisis situation.

Collecting or processing biometric identifiers for other purposes will require consent that satisfies the applicable CPA requirements. However, employers will be able to collect and process biometric identifiers where the anticipated uses are “aligned with the reasonable expectations” of an employee based on the employee’s job description or role, or of a prospective employee based on reasonable background check, application, or identification requirements.

Organizations that collect and process information that could be considered biometric identifiers or biometric data in various jurisdictions around the country will need to conduct a detailed analysis of their growing privacy and cybersecurity obligations, including incident response requirements. For assistance with that analysis, please see our biometric law map.

In 2021, the Department of Labor (DOL) issued cybersecurity guidance for ERISA-covered retirement plans. The guidance expands the duties retirement plan fiduciaries have when selecting service providers. Specifically, the DOL makes clear that when selecting retirement plan service providers, plan fiduciaries must prudently assess the cybersecurity of those providers.  

On May 15, 2024, the Securities and Exchange Commission (SEC) adopted amendments to Regulation S-P, which governs the treatment of nonpublic personal information about consumers by certain financial institutions, many of which commonly serve as vendors and service providers to retirement plans. For example, the amendments reach broker-dealers, investment companies, registered investment advisers, and transfer agents. Importantly, the amendments establish specific cybersecurity requirements for these entities, requirements that retirement plan fiduciaries should be aware of.

Some of the key requirements include:

  • Incident Response Program:
    • Covered institutions must develop, implement, and maintain written policies and procedures for an incident response program.
    • The program should be reasonably designed to detect, respond to, and recover from unauthorized access to or use of customer information.
  • Notice Requirements:
    • Covered institutions must provide notice to individuals whose sensitive customer information was accessed or used without authorization.
    • The notice must include details about the incident, the breached data, and steps affected individuals can take to protect themselves.
    • Notice must be provided as soon as practicable, but not later than 30 days after becoming aware of the incident.
  • Service Provider Oversight:
    • Covered institutions must establish, maintain, and enforce written policies and procedures reasonably designed to require oversight of service providers, including through due diligence and monitoring.
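The 30-day outer limit on the notice requirement above lends itself to a simple date calculation. The following sketch is purely illustrative (the function names and the example dates are ours, not from the rule, and deadline questions should always be confirmed with counsel), but it shows how a compliance team might track the latest permissible notice date once a covered institution becomes aware of an incident:

```python
from datetime import date, timedelta

# Illustrative only: Reg S-P requires notice "as soon as practicable,"
# and no later than 30 days after the institution becomes aware of
# unauthorized access to or use of sensitive customer information.
NOTICE_WINDOW_DAYS = 30

def notice_deadline(date_aware: date) -> date:
    """Latest permissible notice date under the 30-day outer limit."""
    return date_aware + timedelta(days=NOTICE_WINDOW_DAYS)

def notice_overdue(date_aware: date, today: date) -> bool:
    """True if the 30-day outer limit has passed."""
    return today > notice_deadline(date_aware)

# Example: the institution becomes aware of the incident on June 1.
aware = date(2024, 6, 1)
print(notice_deadline(aware))                    # 2024-07-01
print(notice_overdue(aware, date(2024, 7, 2)))   # True
```

Note that the rule's trigger is awareness of the incident, not its occurrence, which is why the calculation starts from the awareness date.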

The amendments also set forth requirements for maintaining written records documenting compliance. The retention period varies by type of covered institution, but the minimum is at least two years.

The amendments become effective 60 days after publication in the Federal Register. Larger entities will have 18 months after publication to comply, and smaller entities will have 24 months.

When assessing the cybersecurity of a retirement plan service provider that is a financial institution, plan fiduciaries may want to be aware of these requirements as part of their assessment process. For example, the changes to the SEC requirements for incident reporting may be useful to retirement plan sponsors as they consider their own incident response plans, should a data breach experienced by a 401(k) plan involve the data of their current and former employees.  

If you have questions about steps plan fiduciaries should be thinking about when assessing service providers to their plans, including the potential impact of the SEC’s amended Regulation S-P, contact a member of Jackson Lewis’ Privacy, Data, and Cybersecurity practice group to discuss.

Last year the White House weighed in on the use of artificial intelligence (AI) in businesses.

Since the executive order, several government entities including the Department of Labor have released guidance on the use of AI.

And now the White House published principles to protect workers when AI is used in the workplace.

The principles apply to both the development and deployment of AI systems. These principles include:

  • Awareness – Workers should be informed of and have input in the design, development, testing, training, and use of AI systems in the workplace.
  • Ethical development – AI systems should be designed, developed, and trained in a way to protect workers.
  • Governance and Oversight – Organizations should have clear governance systems and oversight for AI systems.
  • Transparency – Employers should be transparent with workers and job seekers about AI systems being used.
  • Compliance with existing workplace laws – AI systems should not violate or undermine workers’ rights, including the right to organize, health and safety rights, and other worker protections.
  • Enabling – AI systems should assist workers and improve job quality.
  • Supportive during transition – Employers should support workers during job transitions related to AI.
  • Privacy and Security of Data – Workers’ data collected, used, or created by AI systems should be limited in scope and used to support legitimate business aims.

If you have questions about the federal government’s guidance pertaining to the use of AI in the workplace or related issues, contact a Jackson Lewis attorney to discuss.

On May 1, 2024, amendments to Utah’s cybersecurity and data breach notification law took effect.

The state’s cybersecurity and data breach notification law requires an organization that conducts business in the State of Utah to prevent the unlawful use or disclosure of personal information collected by the organization.

Under the requirements, if an organization that owns or maintains the personal information of a Utah resident becomes aware of a breach of system security, the organization must investigate to determine if the personal information has been or will be misused. If misuse has occurred or is likely to occur, the organization must notify every affected Utah resident. And if 500 or more Utah residents are affected, the organization must also notify the Utah Attorney General’s Office and the Utah Cyber Center. The Utah Cyber Center coordinates efforts between state, local, and federal resources to support security and defend against cyber-attacks.
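The triage above follows a simple decision tree: a misuse determination triggers resident notice, and the 500-resident threshold adds regulator notice. The sketch below is a hypothetical illustration (the function and labels are ours, not statutory language, and a real misuse determination requires an actual investigation and legal review):

```python
# Hypothetical sketch of the Utah notification triage described above.
UTAH_REGULATOR_THRESHOLD = 500  # residents triggering AG / Cyber Center notice

def utah_notice_obligations(misuse_likely: bool, affected_utah_residents: int) -> list:
    """Return the notices the incident appears to require."""
    notices = []
    if misuse_likely:
        # Misuse occurred or is likely: every affected resident must be notified.
        notices.append("affected Utah residents")
        if affected_utah_residents >= UTAH_REGULATOR_THRESHOLD:
            # 500 or more residents affected: regulators must also be notified.
            notices.append("Utah Attorney General's Office")
            notices.append("Utah Cyber Center")
    return notices

print(utah_notice_obligations(True, 1200))  # all three notices
print(utah_notice_obligations(True, 100))   # resident notice only
```

If the investigation concludes misuse is not likely, no notice obligation arises under this framework, regardless of headcount.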

The recent amendments revise the definition of “personal data” to be information that “is linked or can be reasonably linked” to an identified individual or identifiable individual.

Concerning nongovernmental entities, the amendments add a definition for the term “data breach,” which is now defined as the “unauthorized access, acquisition, disclosure, loss of access, or destruction of” the personal data of 500 or more individuals; or of data that “compromises security, confidentiality, availability, or integrity of the computer system in use or information maintained by a governmental entity.”

The amendments reiterate that the disclosure of a breach may be confidential and classified as a protected record.

The amendments require reporting entities to include additional information in breach notifications including:

  • the date the breach of system security occurred;
  • the date the breach was discovered;
  • the total number of people impacted by the breach, with a breakout of the total number of Utah residents;
  • the type of personal information involved in the breach; and
  • a short description of the breach that occurred.
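The required data points above can be captured in a simple structured record. The following sketch is illustrative only (the class and field names are ours, not the statute’s wording), showing how an incident-response team might collect these items for a notification:

```python
from dataclasses import dataclass, field

# Illustrative record of the data points Utah's amended law requires
# in a breach notification. Field names are ours, not statutory language.
@dataclass
class UtahBreachNotice:
    date_breach_occurred: str       # date the breach of system security occurred
    date_breach_discovered: str     # date the breach was discovered
    total_people_impacted: int      # total impacted, all jurisdictions
    utah_residents_impacted: int    # breakout of Utah residents
    personal_info_types: list = field(default_factory=list)
    description: str = ""           # short description of the breach

# Hypothetical example incident.
notice = UtahBreachNotice(
    date_breach_occurred="2024-03-01",
    date_breach_discovered="2024-03-10",
    total_people_impacted=2400,
    utah_residents_impacted=800,
    personal_info_types=["name", "Social Security number"],
    description="Unauthorized access to an employee email account.",
)
print(notice.utah_residents_impacted)  # 800
```

Keeping the Utah-resident count as its own field mirrors the statute’s requirement that it be broken out from the overall total.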

Utah also revised reporting requirements for governmental entities that discover a data breach. When reporting to the Cyber Center, governmental entities must include all of the above-referenced items and also:

  • the path or means by which access was gained to the system, computer, or network, if known;
  • the individual or entity that perpetrated the data breach, if known; and
  • any other details requested by the Cyber Center.

If you have questions about Utah’s breach notification requirements or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

As reported by CNN, a high school principal in Pikesville, Maryland, found his life and career turned upside down when in January a recording suggesting the principal made racially insensitive and antisemitic remarks went viral. The school faced a flood of calls from concerned persons in the district, security was tightened, and the principal was placed on administrative leave. No doubt, a challenging situation for any human resources executive, one made far more difficult because of AI.

An investigation ensued, all the while the school principal maintained that he did not make the statements in the recording – it was not his voice, he claimed. Of course, the “recording” was good enough to put the school district on edge.

It was not until months later, in late April, that a Baltimore County Police Department investigation concluded that the recording was a fake, a “deepfake,” generated by artificial intelligence (AI) technology. As reported by CNN, Baltimore’s County Executive, Johnny Olszewski, observed:

“Today, we are relieved to have some closure on the origins of this audio…However, it is clear that we are also entering a new, deeply concerning frontier.”

Deepfake AI is a type of artificial intelligence used to create convincing images, audio and video hoaxes. Although deepfakes might have some utility, such as for entertainment purposes, they blur the lines between reality and fiction, making it increasingly difficult to discern truth from falsehood. As in the case of the Baltimore school principal, misuse raises significant concerns, particularly in the workplace. It turns out that the deepfake recording may have arisen from an employment dispute that the principal was having with the high school’s athletic director.

The US Department of Homeland Security and other agencies have recognized the threat deepfakes present. At the same time, the technology is getting easier and easier to use and harder to identify. In this case, it took three months for the Baltimore County Police Department to investigate and make a determination about the recording.

The World Economic Forum’s “4 ways to future-proof against deepfakes in 2024 and beyond” offers a sobering suggestion for dealing with deepfakes – zero-trust.

This mindset aligns with mindfulness practices that encourage individuals to pause before reacting to emotionally triggering content and engage with digital content intentionally and thoughtfully.

This may not be the mindset most HR professionals prefer to have at or near the top of their lists. But in this context, when presented with electronic material or even a photograph from an unknown source, despite how real it might appear, intentionality and thoughtfulness should prevail.

Consider being presented, as here, with a video, a recording, or some other image, photograph, or transcribed conversation containing insensitive remarks purportedly made by an employee about another’s race, religion, gender, etc. An organization might not have a police department willing and able to assist, yet it might face just as much pressure in the workplace from persons reacting to the content, believing it is authentic when it may be nothing more than a fake. Having an internal plan outlining a process for investigation, and resources (internal or external) lined up to evaluate the material, would help ensure that intentionality and thoughtfulness. Such a plan might also guide decision-making around the various employment decisions to be made along the way, including internal and external communications, should the investigation carry on.

This is only the beginning.