On August 2, 2024, Governor Pritzker signed Senate Bill (SB) 2979, which amends the Illinois Biometric Information Privacy Act, 740 ILCS 14/1 et seq. (BIPA). The bill, which passed both the Illinois House and Senate by an overwhelming majority, confirms that a private entity that more than once collects or discloses the same biometric identifier or biometric information from the same person via the same method of collection in violation of the Act has committed a single violation for which an aggrieved person is entitled to, at most, one recovery. SB 2979 adds the following clarifying language to Section 20 of the BIPA, the section of the statute that identifies the damages a prevailing party may recover under the Act:

(b) For purposes of subsection (b) of Section 15, a private entity that, in more than one instance, collects, captures, purchases, receives through trade, or otherwise obtains the same biometric identifier or biometric information from the same person using the same method of collection in violation of subsection (b) of Section 15 has committed a single violation of subsection (b) of Section 15 for which the aggrieved person is entitled to, at most, one recovery under this Section.

(c) For purposes of subsection (d) of Section 15, a private entity that, in more than one instance, discloses, rediscloses, or otherwise disseminates the same biometric identifier or biometric information from the same person to the same recipient using the same method of collection in violation of subsection (d) of Section 15 has committed a single violation of subsection (d) of Section 15 for which the aggrieved person is entitled to, at most, one recovery under this Section regardless of the number of times the private entity disclosed, redisclosed, or otherwise disseminated the same biometric identifier or biometric information of the same person to the same recipient.

The amendment takes effect immediately.

Background

In Cothron v. White Castle System, Inc., 2023 IL 128004, the Illinois Supreme Court held that claims under Sections 15(b) and (d) of the BIPA accrue “with every scan or transmission” of alleged biometric identifiers or biometric information.  Yet, the Illinois Supreme Court, in deciding the issue of claim accrual under Sections 15(b) and (d) of the BIPA, acknowledged that there was some ambiguity about how its holding should be construed in connection with Section 20 of the BIPA, which outlines the damages that a prevailing party may recover. Notably, the Illinois Supreme Court acknowledged, “there is no language in the Act suggesting legislative intent to authorize a damages award that would result in the financial destruction of a business,” which would be the result if the legislature intended to award statutory damages on a “per-scan” basis. The Court went on to say that “policy-based concerns about potentially excessive damage awards under the Act are best addressed by the legislature” and expressly “suggest[ed] that the legislature review these policy concerns and make clear its intent regarding the assessment of damages under the Act.”

SB 2979 was introduced in the Illinois Senate on January 31, 2024, in response to the invitation from the Illinois Supreme Court and clarifies the General Assembly’s intention regarding the assessment of damages under the BIPA.

Electronic Signatures

In addition, the bill also adds “electronic signature” to the definition of written release, clarifying that an electronic signature constitutes a valid written release under Section 15(b)(3) of the BIPA. An electronic signature is defined in SB 2979 as “an electronic sound, symbol, or process attached to or logically associated with a record and executed or adopted by a person with the intent to sign a record.”

If you have questions about SB 2979 or related issues, please contact a member of our Privacy, Data, and Cybersecurity group.

Virtually all organizations have an obligation to safeguard their personal data against unauthorized access or use, and, in some instances, to notify affected individuals in the event such access or use occurs.  Those obligations are, in some instances, relatively nebulous, and organizations—for better or worse—have flexibility to determine what pre-incident safeguards and post-incident responsive actions are “reasonable” under the circumstances. 

The SEC, in its recent amendments to Regulation S-P (the Amendments), takes a different approach.  The Amendments impose detailed and specific obligations on covered institutions—including broker-dealers, investment companies, registered investment advisers, and transfer agents—to (1) develop and maintain written incident response programs and (2) provide notification to affected individuals in the event their sensitive customer information is subject to unauthorized access or use (a Data Breach).

Incident Response Program

The Amendments require covered institutions to develop and maintain written incident response programs.  The function of these programs is to enable covered institutions to better detect and respond to Data Breaches, including by facilitating their:

  • assessment of the nature and scope of these incidents, including identification of the internal systems containing customer information and the types of customer information that may have been accessed or used without authorization.  The Amendments indicate that covered institutions, when assessing an incident, should consider the type and extent of the unauthorized access, the impact on operations, and whether information has been exfiltrated or is no longer accessible;
  • containment and control of the incident to prevent further unauthorized access to or use of customer information.  The Amendments acknowledge that the appropriate steps for containing and controlling an incident will vary based on its nature, but identify the following as potential key action items: isolating affected systems, enhancing system monitoring, identifying additional compromised systems, forcing password resets, and changing or disabling default user accounts; and
  • notification to individuals whose “sensitive customer information” (defined below) was, or is reasonably likely to have been, accessed or used without authorization.

Notably, while the foregoing incident response program requirements apply to all consumer “nonpublic personal information”—a broad category encompassing all personally identifiable financial information a financial institution collects about an individual in connection with providing a financial product or service—the notification obligations discussed below are limited to incidents impacting “sensitive customer information.”

Notification to Affected Individuals

Covered institutions must provide notice to each affected individual whose sensitive customer information was, or is reasonably likely to have been, subject to a Data Breach.  “Sensitive customer information” includes:

  • information uniquely identified with an individual, such that it can reasonably be used to authenticate the individual’s identity;
  • government-issued identification numbers, including a social security number, driver’s license number, alien registration number, passport number, or employer or taxpayer identification number;
  • a biometric record;
  • a unique electronic identification number, address, or routing code;
  • telecommunication identifying information or access device; or
  • information identifying an individual or an individual’s account, including an account number, name, or online username, in combination with other authenticating information that could be used to gain access to an individual’s account.

In the event of a Data Breach, the Amendments require covered institutions to provide clear and conspicuous notice “as soon as practicable,” but not later than 30 days after their discovery of the breach.  Notice to affected individuals must include the following:

  • a general description of the incident and type of sensitive customer information affected;
  • the date (or estimated date/date range) of the incident;
  • contact information notice recipients can utilize to obtain more information about the incident; and
  • steps affected individuals can take to protect their information, including how they can obtain free credit reports, place fraud alerts on their accounts, and review their account statements for suspicious activity.

Under the Amendments, unauthorized access to or use of sensitive customer information does not always trigger the obligation to notify.  Notice is not required if, after a reasonable investigation of relevant facts and circumstances, the covered institution determines that the sensitive customer information in question has not been, and is not reasonably likely to be, used in a manner resulting in substantial harm or inconvenience (e.g., because it was protected by encryption).  The Amendments indicate that, if a covered institution reasonably determines that a specific individual's sensitive customer information was not accessed or used without authorization, it does not need to notify that individual.  However, if the covered institution is unable to identify which specific individual's sensitive customer information has been accessed or used, it must notify all individuals whose information resided on the impacted information system.

Implementation

The Amendments will take effect in early August 2024, but covered entities—depending on their size—will have an 18- or 24-month grace period to come into compliance.  Larger entities, which are defined below, will need to come into compliance by December 2025, while smaller entities will have until June 2026.    

An entity qualifies as a larger entity as follows:

  • Investment companies (together with other investment companies in the same group of related investment companies): net assets of $1 billion or more as of the end of the most recent fiscal year.
  • Registered investment advisers: $1.5 billion or more in assets under management.
  • Broker-dealers: all broker-dealers that are not small entities under the Securities Exchange Act for purposes of the Regulatory Flexibility Act.
  • Transfer agents: all transfer agents that are not small entities under the Securities Exchange Act for purposes of the Regulatory Flexibility Act.

Takeaways

Though the grace periods will likely lull some entities into near-term complacency—believing they have plenty of time to get their houses in order—prudent entities will place compliance with the Amendments high on their task lists. 

For entities that haven’t already made a significant investment in their incident response programs, development of the robust program the Amendments require will be a heavy lift.  Compliance with the assessment component, for instance, may require entities to conduct extensive data mapping to better understand what data they have, where it’s stored, how it’s safeguarded, and how long it’s retained. 

They may also need to take a close look at their current controls to detect and rapidly investigate and respond to potential Data Breaches, including those that enable the isolation of affected systems, the identification and eradication of ongoing malicious activity, and the restoration of business operations, including potential data recovery from backups. 

Covered entities will also need to prepare to analyze their notification obligations and timely provide requisite notices. 

To many, the above requirements will sound familiar, as they overlap to a degree with obligations imposed by state reasonable safeguard and breach notification laws.  The Amendments’ incident response plan prescriptions, however, are more detailed and onerous than the requirements imposed by most state laws, and their definition of “sensitive customer information” is broader than the definition of “personally identifiable information” (or the comparable term) in most states.  Accordingly, even entities that have mature incident response programs in place would benefit from giving those programs a fresh look to ensure they meet the Amendments’ lofty requirements. 

Jackson Lewis’ Financial Services and Privacy, Data, and Cybersecurity groups will continue to track this development.  Please contact a Jackson Lewis attorney with any questions.

With the Texas Data Privacy and Security Act (TDPSA) on the verge of taking effect on July 1, 2024, the State’s Attorney General, Ken Paxton, recently launched an initiative for “aggressive enforcement of Texas privacy laws.”  As part of the initiative, Paxton has established a team that will focus on the enforcement of Texas’ privacy protection laws, including the TDPSA, along with federal laws like the Children’s Online Privacy Protection Act (COPPA). 

Unlike most of the 15-plus states with comprehensive privacy laws that exclude from their scope organizations that do not meet significant data volume thresholds (e.g., processing data related to at least 100,000 state residents), the TDPSA, with limited exceptions, applies to any organization that conducts business in the state of Texas or produces a product or service consumed by Texas residents. In contrast to the California Consumer Privacy Act (CCPA), the TDPSA excludes human resources and business-to-business data. But aside from this exclusion, if an organization processes the personal data of consumers residing in Texas, there is a good chance it will be in scope.

Organizations that have programs in place to comply with the CCPA will have a head start toward compliance with the TDPSA.  That said, there are aspects of the TDPSA that differ from or go beyond the CCPA.  For instance, the TDPSA requires:

  • the inclusion of specific privacy policy disclosures related to the sale of biometric or sensitive personal data;
  • the collection of consent before processing personal data for previously undisclosed purposes or processing sensitive personal data;
  • data protection assessments in connection with processing sensitive personal data, selling personal data, or using it for targeted advertising;
  • the inclusion of specific provisions in vendor agreements; and
  • a mechanism for consumers to appeal the denial of their requests to exercise their TDPSA rights.   

For assistance bringing your organization into compliance with the TDPSA, please contact a member of our Privacy, Data, and Cybersecurity group.

“Cybersecurity” has emerged as one of the top risks facing organizations. Considering the steady stream of massive data breaches affecting millions (sometimes billions), the debilitating effects of ransomware on an organization’s information systems, the intrigue of international threat actors, and the mobilization and collaboration of national law enforcement to thwart these attacks, it’s no wonder. Notions of privacy have long underpinned critical principles and rights in our legal system, yet actors in the privacy space typically do not have names like LockBit or Black Basta or use applications called Cobalt Strike, and [yawn] may not trigger concerns as seemingly compelling as cybersecurity. But that may be changing, at least in the minds of insurance underwriters and persons focused on compliance.

As a recent DarkReading article points out, there is a growing sense that the “mishandling [of] protected personally identifiable information (PII) could rival the cost of ransomware attacks.” The article discusses several reasons driving this view, citing, among other things, the recent uptick in pixel litigation, that is, litigation concerning the handling of website users’ personal information obtained without consent through tracking technologies on websites.

However, the article also alludes to the vast patchwork of nuanced privacy laws across numerous jurisdictions as support for an increasing number of insurance professionals viewing privacy as the “top insurance concern.” In addition to the onslaught of litigation over the use of website tracking technologies, the challenges of navigating the ever expanding and deepening maze of privacy law seem to present much greater compliance and litigation risks for organizations.

An Insurance Journal article, “The Cyber Risk Pendulum,” echoed these sentiments earlier this month and observed:

In 2024, there is a greater focus [by carriers] on controls related to “wrongful collection” coverage – the collection of data in a manner that could run afoul of privacy regulations – whether it be on a state or federal level.

This makes sense considering the emergence of state comprehensive privacy laws, most notably the California Consumer Privacy Act (CCPA). Consider that the first “Enforcement Advisory” issued by the California Privacy Protection Agency, the agency charged with enforcing the CCPA, focuses on “data minimization” – a requirement that includes assessing the collection, use, retention, and sharing of personal information from the perspective of minimizing the personal information processed for the intended purpose(s).   

For many organizations, different privacy laws can apply depending on a range of factors, including without limitation: industry, business location, categories of customers, types of equipment used, specific services provided, methods of marketing and promotion, the categories of information collected, and employment practices.

Consider a health care organization:

  • Industry: Of course, most if not all have at least heard of the Health Insurance Portability and Accountability Act (HIPAA). Covered entities and business associates (defined terms under HIPAA generally including healthcare providers and service providers to those entities) must comply with a comprehensive set of privacy regulations regulating the use and disclosure of all protected health information, regardless of format.
  • Where it does business: All states have long-standing health laws regulating the use and disclosure of patient medical information. Indeed, HIPAA does not preempt more stringent state laws, so covered entities and business associates must comply with them, a particular challenge for multi-state organizations. In addition to state health laws affecting the use and disclosure of patient information, common law privacy rights and obligations also need to be considered.
  • Types of customers: A healthcare provider might provide services to or on behalf of government entities, in which case it may have to comply with certain contractor mandates. Or, it may focus its health services on minors versus adults, requiring it to understand, for example, the specific consent rules pertaining to minors’ medical information. Mental healthcare providers may have an additional layer of privacy obligations concerning their patients.
  • Equipment it uses: Whether dealing with medical devices, GPS tracking of vehicles, biometric devices used to verify access to certain drugs, or smart cameras for facility surveillance, healthcare organizations must consider the privacy issues related to the different types of equipment used in the delivery of care and operations. The increasing use of biometrics, as one example, has become a major risk in and beyond the healthcare industry, particularly in Illinois. By some counts, alleged violations of the Illinois Biometric Information Privacy Act (BIPA) have led to nearly 2,000 putative class action cases. The BIPA, a privacy statute, creates a remedy for, among other things, failing to obtain a consent or written release in connection with collecting a biometric identifier or biometric information.
  • Types of services:
    • University hospitals, for example, also have compliance obligations under the Family Educational Rights and Privacy Act (FERPA).
    • Providers running certain federally assisted programs involving substance use services must comply with the substance abuse confidentiality regulations issued by the Substance Abuse and Mental Health Services Administration. See 42 CFR Part 2 (although recent regulations finalized in February strive to align this framework with HIPAA).
    • When treating certain highly contagious diseases, providers also must consider laws regulating the use and disclosure of information related to those diseases, which often provide stronger protections and limitations on disclosure.
    • A healthcare provider that performs genetic testing services must consider the applicable genetic information privacy laws, which exist in just about all 50 states. One such law is the Illinois Genetic Information Privacy Act (GIPA) passed in 1998. This law may become the next significant privacy target for the Illinois plaintiffs’ bar. Arguably more nuanced than its sister statute, the BIPA, the GIPA has been the subject of an increasing number of case filings in the past year. Compliance can be challenging. For example, the GIPA incorporates some familiar laws – GINA, ADA, Title VII, FMLA, OSHA, and others – requiring that certain entities, including employers, treat genetic testing and genetic information (including certain family medical history information) in a manner consistent with such laws. So, it is not just the GIPA that organizations need to worry about in order to comply with the GIPA.
  • Marketing its services: In addition to the use of tracking technologies referenced above, other means of collecting and sharing personal information to promote the organization’s business may have significant privacy consequences under federal and state consumer protection laws. Examples include emailing and texting, use of employee and patient images and likeness in advertisements, and sharing personal information with third parties in connection with marketing and promotion activities.
  • Categories of personal information: Not all “personal information” is the same. The various definitions of data may drive different compliance obligations, including for healthcare organizations, and this just scratches the surface.
  • Employment practices: The processing of personal information pertaining to employees, applicants, contractors, etc., creates an additional layer of privacy obligations that touch on many of the items noted above. Areas of particular concern include the increasing use of AI in hiring and promotion, workplace surveillance, methods of identity verification, managing employee medical information, and maintaining employee benefit plans. Each of these areas raises particular issues under federal and/or state law, shaped by the categories of information at issue.

Attempting to track, never mind become compliant with, the various privacy laws affecting each of these facets of the business is no easy task. We have not even considered the broader, more detailed, and more comprehensive privacy frameworks established internationally, such as the EU General Data Protection Regulation (GDPR). And, of course, it is not just healthcare providers that face these privacy challenges at various levels of their operations. Keeping information secure from cyberattacks is one thing, and it too is quite challenging, but there are established frameworks for doing so that share many common threads. In the case of privacy, there seem to be many more subtle considerations that are critical for compliance.

For instance, in most cases, establishing a password policy under a cybersecurity law to protect personal information is solving for one issue: requiring persons to develop a relatively strong password that will make it difficult for an unauthorized person to gain access to the protected system. This may be oversimplifying, but the point is that a good password policy might suffice under many different cybersecurity laws, regardless of state, type of business, category of data, etc. Complying with a privacy law regulating the disclosure of health information, on the other hand, likely will require that several factors be considered: the type of entity, where it does business, the specific type of data, the individual’s age or medical condition, the reason for the disclosure, the intended recipient, etc.

Regulatory compliance is not the end of the story for privacy. For example, organizations can inflict wounds on themselves when they make assertions about the handling and safeguarding of the personal information they collect and then fail to meet those assertions. A good example is the privacy policy on an organization’s website. Stating in such a policy that the organization will “never” disclose the personal information collected on the site may create a binding obligation on the organization, even if no law requires such a rule concerning disclosure. Check out the Federal Trade Commission’s enforcement of these kinds of issues in its recently issued 2023 Privacy and Data Security Update.

Is privacy a bigger risk than cyber? Maybe. Regardless, trying to keep track of and comply with the wide range of privacy laws is no easy task, particularly considering that so much of the application of those laws is determined by many factors. For this reason, it is not hard to see why underwriters may view privacy as their top concern, and why organizations need trusted and experienced partners to help navigate the maze.

On April 4, 2024, Kentucky’s Governor signed House Bill 15, which establishes a consumer data privacy law for the state. The state joins New Hampshire and New Jersey in passing comprehensive consumer privacy laws in 2024. Kentucky’s law takes effect January 1, 2026.

To whom does the law apply?

The law applies to persons, hereafter referred to as controllers, that conduct business in Kentucky or produce products or services that are targeted to residents of Kentucky and during a calendar year control or process personal data of at least:

  • 100,000 consumers; or
  • 25,000 consumers and derive over 50% of gross revenue from the sale of personal data.

Who is protected by the law?

A consumer protected under the new legislation is defined as a natural person who is a resident of Kentucky, acting in an individual context. A consumer does not include a person acting in a commercial or employment context.  

What data is protected by the law?

The legislation protects personal data defined as information that is linked or reasonably linkable to an identified or identifiable natural person.

Sensitive data is defined under the law as personal data indicating racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, or citizenship or immigration status. It also includes genetic or biometric data processed to uniquely identify a specific natural person, personal data of a minor, and precise geolocation data.

What are the rights of consumers?

Under the law, consumers have the following rights:

  • To confirm whether a controller is processing their personal data
  • To correct inaccurate personal data
  • To delete personal data maintained by the controller
  • To opt out of the processing of personal data for purposes of targeted advertising, sale, or certain profiling

What obligations do controllers have?

Under the legislation, controllers must:

  • Establish, implement, and maintain reasonable administrative, technical, and physical data security practices;
  • Limit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which the data is processed; and
  • Obtain consent from consumers before processing sensitive data concerning the consumer.

How is the law enforced?

The Attorney General has exclusive authority to enforce violations of the legislation. The law does provide for a 30-day right to cure violations by controllers and processors of data.

If you have questions about Kentucky’s privacy law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

A manager texting one of his drivers, who had covered the truck’s inward-facing camera while stopping for lunch – “you can’t cover the camera it’s against company rules” – is not unlawful under the National Labor Relations Act (NLRA), according to a recent decision by the D.C. Circuit Court of Appeals.

A practice that has a reasonable tendency to coerce employees in the exercise of their rights under the NLRA is unlawful, according to National Labor Relations Board (NLRB) precedent. An employer’s creating an impression that it is surveilling employees while exercising their rights under the NLRA may constitute such coercion, according to the NLRB. In Stern Produce Co., Inc. v. NLRB, the Board argued the manager’s texting created such an impression. The D.C. Circuit Court of Appeals disagreed.

Like many companies managing a fleet of vehicles, in this case delivery trucks, Stern Produce Co. equips its trucks with dash-cams and telematics technologies. These systems can serve important functions for businesses – helping to ensure safe driving, protecting drivers and the businesses from liability for accidents for which they are not at fault, improving efficiencies through location tracking, etc. They also raise significant privacy issues, not the least of which arise from inward-facing cameras.

Stern required drivers to keep truck dash-cams on at all times, unless authorized to turn them off. While driving a truck for Stern, one of its drivers, Ruiz, parked for a lunch break and covered the truck’s inward-facing camera. Hours later, Ruiz’s manager sent him a text: “Got the uniform guy for sizing bud, and you can’t cover the camera it’s against company rules.”

Perhaps in a move to further the positions outlined in a November 2022 memorandum concerning workplace surveillance, the Board’s General Counsel issued a complaint, alleging that the text created an impression of surveillance of organizing activities by making Ruiz aware that he was being watched. According to the Administrative Law Judge, the text did not create an impression of surveillance, but amounted to “mere observation” which was consistent with “longstanding company policies” about truck cameras. Those policies included Stern’s handbook which reserved for Stern the right to “monitor, intercept, and/or review” any data in its systems and to inspect company property at any time without notice. The handbook instructed drivers that they “should have no expectation of privacy” in any information stored or recorded on company systems, including “[c]losed-circuit television” systems, or in any company property, including vehicles. The company also maintained a manual for drivers that addressed the telematics and dash-cam technologies in their trucks. Specifically, the manual states that “[a]ll vehicle safety systems, telematics, and dash-cams must remain on at all times unless specifically authorized to turn them off or disconnect.”

The Board disagreed with the ALJ. Ruiz was a known supporter of a union organizing drive and had previously been subjected to unfair labor practices. Due in part to this history, the Board held the surveillance was “out of the ordinary” and argued the manager had no justification for reviewing the camera, as he had done so in the past only in connection with safety concerns.

Stern’s handbook and driver manual proved to be important to the D.C. Circuit’s analysis. The court noted that drivers were aware of the potential monitoring through the dash-cams and that those cameras must remain on at all times. The Board’s position that there was no evidence that Ruiz knew these policies when he covered the camera was “nonsense,” according to the court. Beyond the policies, the court reasoned that a driver would not have a basis to believe he was being monitored for organizing activities when (i) the driver knew he could be monitored in the vehicle at all times, and (ii) there was no evidence of union activity going on in the small cab of a delivery truck.

It is worth noting that the court recognized that elevated or abnormal scrutiny of pro-union employees can support a finding of impressions of surveillance. That was not the case here, even with Ruiz being a known supporter of union organizing efforts. The manager’s one-time, brief text was, according to the court, consistent with company policy, and did not suggest Ruiz was singled out for union activity. The Board did not satisfy the coercion element.

Takeaways from this case

The ubiquity and sophistication of dash-cams and similar monitoring and surveillance technologies raise a host of legal, compliance, and other issues, both in and outside of a labor management context. While the case focused on potential violations of a worker’s rights under the NLRA, it offers several key takeaways beyond labor relations.

  • Understand the technology. This case considered a relatively mundane feature of today’s dash-cams – video cameras. However, current dash-cam technology increasingly leverages more sophisticated technologies, such as AI and biometrics. Decisions to adopt and deploy devices so equipped should be considered carefully.
  • Assess legal and compliance requirements. According to the court in this case, the policies adopted and communicated by the employer were adequate to apprise employees of the vehicle monitoring and mandatory video surveillance in the vehicle. However, depending on the circumstances, more may have been needed. The particular technology at issue and applicable state laws are examples of factors that could trigger additional legal requirements. Such requirements could include (i) notice and policy obligations under the California Consumer Privacy Act, (ii) notice requirements for GPS tracking in New Jersey, (iii) potential consent requirements for audio recording, and (iv) consent requirements for collection of biometrics.
  • Develop and communicate clear policies addressing expectation of privacy. Whether employees are working in the office, remotely from home, or in a vehicle, having clear policies concerning the nature and scope of permissible workplace monitoring is essential. The court in Stern relied significantly on the employer’s policies in finding that it had not violated the NLRA.
  • Provide guidance to managers. Maintaining the kinds of written policies discussed above may not be enough. The enforcement of such policies, particularly in the labor context, also could create liability for employers. In this case, more aggressive actions by the manager directed only at Ruiz could have created an impression of surveillance that coerced the employee in the exercise of his rights. Accordingly, training for managers and even an internal policy for managers may be useful in avoiding and/or defending against such claims, as well as other claims relating to discrimination, invasion of privacy, harassment, etc.

On March 6, 2024, New Hampshire’s Governor signed Senate Bill 255, which establishes a consumer data privacy law for the state. The Granite State joins the growing list of states with comprehensive consumer data privacy laws; it is the second state in 2024 to pass such a law, following New Jersey. The law takes effect January 1, 2025.

To whom does the law apply?

The law applies to persons who conduct business in the state, or who produce products or services targeted to residents of the state, and that during a one-year period:

  • Controlled or processed the personal data of not less than 35,000 unique consumers, excluding personal data controlled or processed solely for the purpose of completing a payment transaction; or,
  • Controlled or processed the personal data of not less than 10,000 unique consumers and derived more than 25 percent of their gross revenue from the sale of personal data.

The law excludes certain entities such as non-profit organizations, entities subject to the Gramm-Leach-Bliley Act, and covered entities and business associates under HIPAA.

Who is protected by the law?

The law protects consumers, defined as residents of New Hampshire. The definition does not include an individual acting in a commercial or employment context.

What data is protected by the law?

The law protects personal data defined as any information linked or reasonably linkable to an identified or identifiable individual. Personal data does not include de-identified data or publicly available information. Other exempt categories of data include without limitation personal data collected under the Family Educational Rights and Privacy Act (FERPA), protected health information under HIPAA, and several other categories of health information.

What are the rights of consumers?

Consumers have the right under the law to:

  • Confirm whether or not a controller is processing the consumer’s personal data and access such personal data
  • Correct inaccuracies in the consumer’s personal data
  • Delete personal data provided by, or obtained about, the consumer
  • Obtain a copy of the consumer’s personal data processed by the controller
  • Opt-out of the processing of the personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of solely automated decisions that produce legal or similarly significant effects. Although subject to some exceptions, a “sale” of personal data under the New Hampshire law includes the exchange of personal data for monetary or other valuable consideration by the controller to a third party, language similar to the California Consumer Privacy Act (CCPA).

When consumers seek to exercise these rights, controllers shall respond without undue delay, but no later than 45 days after receipt of the request. The controller may extend the response period by 45 additional days when reasonably necessary. A controller must establish a process for a consumer to appeal the controller’s refusal to take action on a request within a reasonable period of the decision. As with the CCPA, controllers generally may authenticate a request to exercise these rights and are not required to comply with the request if they cannot authenticate, provided they notify the requesting party.

What obligations do controllers have?

Controllers have several obligations under the New Hampshire law. A significant obligation is the requirement to provide a “reasonably accessible, clear and meaningful privacy notice” that meets standards established by the secretary of state and that includes the following content:

  • The categories of personal data processed by the controller;
  • The purpose for processing personal data;
  • How consumers may exercise their consumer rights, including how a consumer may appeal a controller’s decision with regard to the consumer’s request;
  • The categories of personal data that the controller shares with third parties, if any;
  • The categories of third parties, if any, with which the controller shares personal data; and
  • An active electronic mail address or other online mechanism that the consumer may use to contact the controller.

This means that the controller needs to do some due diligence in advance of preparing the notice to understand the nature of the personal information it collects, processes, and maintains.

Controllers also must:

  • Limit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed, as disclosed to the consumer. As with other state data privacy laws, this means that controllers must give some thought to what they are collecting and whether they need to collect it;
  • Not process personal data for purposes that are neither reasonably necessary to, nor compatible with, the disclosed purposes for which such personal data is processed, as disclosed to the consumer unless the controller obtains the consumer’s consent;
  • Establish, implement, and maintain reasonable administrative, technical, and physical data security practices to protect the confidentiality, integrity, and accessibility of personal data appropriate to the volume and nature of the personal data at issue. What is interesting about this requirement, which exists in several other privacy laws, is that this security requirement applies beyond more sensitive personal information, such as social security numbers, financial account numbers, health information, etc.;
  • Not process sensitive data concerning a consumer without obtaining the consumer’s consent, or, in the case of the processing of sensitive data concerning a known child, without processing such data in accordance with COPPA. Sensitive data means personal data that includes data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation, or citizenship or immigration status; the processing of genetic or biometric data for the purpose of uniquely identifying an individual; personal data collected from a known child; or, precise geolocation data;
  • Not process personal data in violation of the laws of this state and federal laws that prohibit unlawful discrimination against consumers;
  • Provide an effective mechanism for a consumer to revoke the consumer’s consent that is at least as easy as the mechanism by which the consumer provided the consumer’s consent and, upon revocation of such consent, cease to process the data as soon as practicable, but not later than fifteen days after the receipt of such request;
  • Not process the personal data of a consumer for purposes of targeted advertising, or sell the consumer’s personal data without the consumer’s consent, under circumstances where a controller has actual knowledge, and willfully disregards, that the consumer is at least thirteen years of age but younger than sixteen years of age; and
  • Not discriminate against a consumer for exercising any of the consumer rights contained in the New Hampshire law, including denying goods or services, charging different prices or rates for goods or services, or providing a different level of quality of goods or services to the consumer.

In some cases, such as when a controller processes sensitive personal information as discussed above or for purposes of profiling, it must conduct and document a data protection assessment for those activities. Such assessments are required for the processing of data that presents a heightened risk of harm to a consumer.  

Are controllers required to have agreements with processors?

As with the CCPA and other comprehensive data privacy laws, the law appears to require that a contract between a controller and a processor govern the processor’s data processing procedures with respect to processing performed on behalf of the controller. 

Among other things, the contract must require that the processor:

  • Ensure that each person processing personal data is subject to a duty of confidentiality with respect to the data;
  • At the controller’s direction, delete or return all personal data to the controller as requested at the end of the provision of services, unless retention of the personal data is required by law;
  • Upon the reasonable request of the controller, make available to the controller all information in its possession necessary to demonstrate the processor’s compliance with the obligations in this chapter;
  • After providing the controller an opportunity to object, engage any subcontractor pursuant to a written contract that requires the subcontractor to meet the obligations of the processor with respect to the personal data; and
  • Allow, and cooperate with, reasonable assessments by the controller or the controller’s designated assessor, or the processor may arrange for a qualified and independent assessor to conduct an assessment of the processor’s policies and technical and organizational measures in support of the obligations under the law, using an appropriate and accepted control standard or framework and assessment procedure for such assessments.  The processor shall provide a report of such assessment to the controller upon request.

Other provisions might be appropriate in an agreement between a controller and a processor, such as terms addressing responsibility in the event of a data breach and specific record retention obligations.

How is the law enforced?

The attorney general shall have sole and exclusive authority to enforce a violation of the statute.

If you have questions about New Hampshire’s privacy law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

On February 28, 2024, President Biden issued an Executive Order (EO) seeking to protect the sensitive personal data of Americans from potential exploitation by particular countries. The EO acknowledges that access to Americans’ “bulk sensitive personal data” and United States Government-related data by countries of concern can, among other things:

…fuel the creation and refinement of AI and other advanced technologies, thereby improving their ability to exploit the underlying data and exacerbating the national security and foreign policy threats.  In addition, access to some categories of sensitive personal data linked to populations and locations associated with the Federal Government — including the military — regardless of volume, can be used to reveal insights about those populations and locations that threaten national security.  The growing exploitation of Americans’ sensitive personal data threatens the development of an international technology ecosystem that protects our security, privacy, and human rights.

The EO also acknowledges that, due to advances in technology combined with access by countries of concern to large data sets, data that is anonymized, pseudonymized, or de-identified is increasingly able to be re-identified or de-anonymized. This prospect is particularly concerning for health information, warranting additional steps to protect health data and human genomic data from threats.

The EO does not specifically define “bulk sensitive personal data” or “countries of concern”; it leaves those definitions to the Attorney General and forthcoming regulations. However, under the EO, “sensitive personal data” generally refers to elements of data such as covered personal identifiers, geolocation and related sensor data, biometric identifiers, personal health data, personal financial data, or any combination thereof.

Significantly, the EO does not broadly prohibit:

United States persons from conducting commercial transactions, including exchanging financial and other data as part of the sale of commercial goods and services, with entities and individuals located in or subject to the control, direction, or jurisdiction of countries of concern, or impose measures aimed at a broader decoupling of the substantial consumer, economic, scientific, and trade relationships that the United States has with other countries. 

Instead, building on previous executive actions, such as Executive Order 13694 of April 1, 2015 (Blocking the Property of Certain Persons Engaging in Significant Malicious Cyber-Enabled Activities), the EO intends to establish “specific, carefully calibrated actions to minimize the risks associated with access to bulk sensitive personal data and United States Government-related data by countries of concern while minimizing disruption to commercial activity.”

In short, some of what the EO does includes the following:

  • Directs the Attorney General, in coordination with the Department of Homeland Security (DHS), to issue regulations that prohibit or otherwise restrict United States persons from engaging in certain transactions involving bulk sensitive personal data or United States Government-related data, including transactions that pose an unacceptable risk to national security. Such proposed regulations, to be issued within 180 days of the EO, would identify the prohibited transactions, countries of concern, and covered persons.
  • Directs the Secretary of Defense, the Secretary of Health and Human Services, the Secretary of Veterans Affairs, and the Director of the National Science Foundation to consider steps, including issuing regulations, guidance, etc. to prohibit the provision of assistance that enables access by countries of concern or covered persons to United States persons’ bulk sensitive personal data, including personal health data and human genomic data.  

At this point, it remains to be seen how this EO might impact certain sensitive personal information or transactions involving the same.

Jackson Lewis will continue to track developments regarding the EO and related issues in data privacy. If you have questions about the Executive Order or related issues, contact a Jackson Lewis attorney to discuss.

To celebrate Data Privacy Day (January 28), we present our top ten data privacy and cybersecurity predictions for 2024.

  1. AI regulations to protect data privacy.

Automated decision-making tools, smart cameras, wearables, and similar applications powered by technology commonly referred to as “artificial intelligence” or “AI” will continue to expand in 2024, as will the regulations to protect individuals’ privacy and secure data when deploying those technologies. Last year, we saw a comprehensive Executive Order from the Biden Administration, the New York City AI law take effect, and states like Connecticut pass laws regarding the state use of AI. Already in 2024, several states have introduced proposed AI regulation, such as New York’s development of an AI Bill of Rights.

The use of “generative AI” also exploded, as several industries sought to leverage its benefits while trying to manage risks. In healthcare, for example, AI and HIPAA do not always mix when it comes to maintaining the confidentiality of protected health information. Additionally, generative AI is not only used for good, as criminal threat actors have enhanced their phishing attacks against the healthcare industry.

  2. The continued expansion of the patchwork of state privacy laws.

In 2023, seven states added comprehensive consumer privacy laws. And several other states enacted more limited privacy laws dealing with social media or health-related data. It looks like 2024 will continue the expansion. Already in 2024, New Jersey has passed its own consumer privacy law, which takes effect in 2025. And New Hampshire is not far behind in potentially passing a statute.

  3. Children’s data protections will expand.

In 2023, several states passed or considered data protection legislation for minors with growing concerns that the Children’s Online Privacy Protection Act (COPPA) was not sufficient to protect children’s data. Connecticut added additional protections for minors’ data in 2023.

In 2024, the Federal Trade Commission (FTC) issued a notice of proposed rulemaking pertaining to COPPA, in addition to several states proposing legislation to protect children’s online privacy.

  4. Cybersecurity audits will become even more of a necessity to protect data.

As privacy protection legislation increases, businesses must start working to protect the data they are collecting and maintaining. Conducting cybersecurity audits to ensure that appropriate policies and procedures are in place will be increasingly important.

In 2023, the California Privacy Protection Agency considered regulations pertaining to cybersecurity audits. The SEC and FTC expanded obligations for reporting security breaches, making audits, incident response planning, and tabletop exercises to avoid such incidents all the more important.

It is anticipated there will be further regulations and legislation forcing companies to consider their cybersecurity in order to protect individuals’ privacy.

  5. Genetic and health data protection will continue to rise.

In 2023, Nevada and Washington passed health data privacy laws to protect data collected that was not subject to HIPAA. Montana passed a genetic information privacy law. Already this year, Nebraska is advancing its own genetic information privacy law. It is likely concerns about health and genetic data will grow along with other privacy concerns, and so too will the legislation and regulations. We also have seen a significant uptick in class action litigation in Illinois under the state’s Genetic Information Privacy Act (GIPA). A close relative of the state’s Biometric Information Privacy Act (BIPA), GIPA carries nearly identical remedy provisions, except that the amounts of statutory damages are higher than under BIPA.

  6. Continued enforcement actions for data security.

As legislation and regulations grow, so too will enforcement actions. Many of the state statutes and city regulations allow only for governmental enforcement; however, those entities are going to start enforcing requirements to ensure there is an incentive for businesses to comply. In 2023, we saw the New York Attorney General continue its active enforcement of data security requirements.

  7. HIPAA compliance will continue to be difficult as it overlaps with cybersecurity.

In 2023, the Office for Civil Rights (OCR), which enforces HIPAA, discussed issues with driving cybersecurity and HIPAA compliance as well as other compliance concerns. In 2024, entities required to comply with HIPAA will be challenged to determine how to use new and useful technologies and data sharing while maintaining privacy and protecting HIPAA-covered information as cybersecurity threats continue to flourish.

  8. Website tracking technologies will continue to be in the hot seat.

In 2023, both the FTC and the Department of Health and Human Services (HHS) took issue with website tracking technologies such as “pixels.” By the time that guidance was issued, litigation concerning these technologies and related data privacy and data sharing concerns had already been expanding. To help clients identify and address these risks, Jackson Lewis and SecondSight joined forces to offer organizations a website compliance assessment tool that has been well received.

In 2024, it is anticipated that there will be further website-tracking litigation as well as enforcement actions from governmental agencies that see the technology as infringing on consumers’ privacy rights.

  9. Expect biometric information to increasingly be leveraged to address privacy and security concerns.

As we move toward a “passwordless” society, technologies using biometric identifiers and information continue to be the “go-to” method for authentication. However, regulations on the collection and use of biometric information are also increasing. While the Illinois Biometric Information Privacy Act (BIPA) is the most prolific in its protection of biometric information, many of the new comprehensive privacy laws include protections for biometric information. See our biometric law map for developments.

  10. Privacy class actions will continue to increase.

Whether it is BIPA, GIPA, CIPA, TCPA, DPPA, pixel litigation, or data breach class actions, 2024 will likely see an increase in privacy-related class actions. As such, it becomes more important than ever for businesses to understand and ensure the protection of the data they collect and control.

For these reasons and others, we believe data privacy will continue to be at the forefront of many industries in 2024, and Jackson Lewis will continue to track relevant developments. Happy Privacy Day!

The Federal Trade Commission (FTC) has approved an amendment to its Safeguards Rule that will require non-banking financial institutions to report certain data breaches (or “notification events”) to the FTC (not affected individuals).

The “Safeguards Rule,” short for “Standards for Safeguarding Customer Information,” was created to ensure that businesses maintain safeguards to protect the security of customer information. The Safeguards Rule already applied to financial institutions subject to the FTC’s jurisdiction that are not subject to the enforcement authority of another regulator under the Gramm-Leach-Bliley Act. Under the Rule, financial institutions are defined as any institution the business of which is engaging in an activity that is financial in nature or incidental to such financial activities. FTC guidance can help to better navigate that definition.

Amendment

While parts of the Safeguards Rule already apply to non-banking financial institutions such as mortgage brokers, motor vehicle dealers, accountants, tax preparation services, and payday lenders, the recent amendment expands the data breach reporting requirements to these entities.

The recent amendment presents a significant expansion of the obligation to provide notification of a “notification event,” even beyond what generally is required under potentially applicable state breach notification laws. Under the FTC’s amendment, the notification obligation applies to “customer information,” whereas most state breach notification laws apply to “personal information.” Remember definitions are important. While states have expanded their definitions of personal information over the years, the term is generally defined to include an individual’s first name (or first initial) and last name, together with one or more of the following data elements:

  • Social security number.
  • Driver’s license number, California identification card number, tax identification number, passport number, military identification number, or other unique identification number issued on a government document commonly used to verify the identity of a specific individual.
  • Account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.
  • Medical information.
  • Health insurance information.
  • Unique biometric data generated from measurements or technical analysis of human body characteristics, such as a fingerprint, retina, or iris image, used to authenticate a specific individual. Unique biometric data does not include a physical or digital photograph, unless used or stored for facial recognition purposes.
  • Information or data collected through the use or operation of an automated license plate recognition system, as defined in Section 1798.90.5.
  • Genetic data.

The above definition is taken from California’s breach notification law that applies to certain businesses and is one of the most expansive. It also includes a username or email address, in combination with a password or security question and answer that would permit access to an online account. However, many other states include only a portion of these elements, often only those in the first three bullets above.

On the other hand, customer information is nonpublic, personally identifiable financial information maintained about a “customer.” For this purpose, a customer is a consumer with whom the financial institution has a continuing relationship to provide financial products or services for personal, family, or household purposes. In its final rule, the FTC describes customer information as follows:

The definition of “customer information” in the Rule does not encompass all information that a financial institution has about consumers. “Customer information” is defined as records containing “non-public personal information” about a customer. “Non-public personal information” is, in turn, defined as “personally identifiable financial information,” and excludes information that is publicly available or not “personally identifiable.” The Commission believes that security events that trigger the notification requirement—where customers’ non-public personally identifiable, unencrypted financial information has been acquired without authorization—are serious and support the need for Commission notification.

This definition is not limited to a specific set of data elements like Social Security numbers or financial account numbers. Also, while many state laws limit the definition of personal information to computerized data, FTC guidance provides that customer information includes “any record containing nonpublic personal information about a customer of a financial institution, whether in paper, electronic, or other form, that is handled or maintained by or on behalf of you or your affiliates.”

Under the amendment, non-banking financial institutions must report to the FTC “notification events” in which the data of at least 500 people has been acquired without authorization, as soon as possible and no later than 30 days after discovery. A few other points about the rule:

  • Notification events are defined as unauthorized acquisitions of customer information, while several state breach notification laws include unauthorized access to personal information.
  • As noted above, the final rule does not require notification to affected individuals. However, like many states, notably Maine, the FTC will publish information about the notification events it receives.
  • The FTC’s final rule does not include a risk of harm exception, a provision found in many state laws. Such provisions can be welcome relief to businesses, as they provide that even if there is a “breach” as defined under the law, notice is not required if, generally speaking, there is no significant risk of harm to affected individuals.

The breach notification requirement becomes effective 180 days after publication of the rule in the Federal Register. 

If you have questions about data breach reporting or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.