UPDATE: On June 16, Gov. Ned Lamont signed HB 5310 into law; it becomes effective October 1, 2021.

State legislatures across the nation are prioritizing privacy and security matters, and Connecticut is no exception. This week, Connecticut Attorney General William Tong announced the passage of An Act Concerning Data Privacy Breaches, a measure that will enhance and strengthen Connecticut’s data breach notification law. The Connecticut House of Representatives unanimously approved the bill on May 27th, and the Senate followed with unanimous approval shortly after. The bill now heads to Governor Ned Lamont for signature.

“Connecticut has led the nation in data privacy for over a decade, and this legislation ensures that we will continue to do so. Since we passed one of our nation’s first laws protecting consumers from online data breaches, technology and risks have evolved. This legislation ensures that our laws reflect those evolving risks and continue to offer strong, comprehensive protection for Connecticut residents,” Attorney General Tong observed in his announcement of the data breach notification bill.

Key aspects of Connecticut’s enhanced data breach notification law include:

  • Expansion of the definition of “personal information.”

Originally, Connecticut defined “personal information” as an individual’s first name or first initial and last name in combination with any one, or more, of the following data:

    • Social security number
    • Driver’s license number
    • State identification card number
    • Credit or debit card number
    • Financial account number in combination with any required security code, access code, or password that would permit access to such financial account.

The new law, if enacted, would look more like similar laws in California and Florida by including additional data categories:

    • Individual taxpayer identification number
    • Identity protection personal identification number issued by the IRS
    • Passport number, military identification number or other identification number issued by the government that is used to verify identity
    • Medical information regarding an individual’s medical history, mental or physical condition or medical treatment or diagnosis by a healthcare professional
    • Health insurance policy number or subscriber identification number, or any unique identifier used by a health insurer to identify the individual
    • Biometric information consisting of data generated by electronic measurements of an individual’s unique physical characteristics and used to authenticate or ascertain the individual’s identity, such as a fingerprint, voice print, retina or iris image; and
    • User name or electronic mail address, in combination with a password or security question and answer that would permit access to an online account.
  • Notification Time and Content.

The new law would shorten the time a business has to notify affected Connecticut residents and the Office of the Attorney General of a data breach from 90 days to 60 days. Remember, as with most other breach notification mandates, the timing requirement here is “without unreasonable delay but not later than 60 days.” In addition, if identification of a resident of the state whose personal information was breached or reasonably believed to have been breached will not be completed within 60 days, the business must provide preliminary substitute notice as outlined by the law, and proceed in good faith to identify affected residents and provide direct notice as expediently as possible. Incident response plans should be reviewed to ensure this requirement is incorporated.
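
For incident response planning, the 60-day outside deadline described above can be tracked programmatically. Below is a minimal sketch; the discovery date, record shape, and function names are illustrative assumptions, not anything prescribed by the statute.

```python
from datetime import date, timedelta

# Amended Connecticut rule as described above: notice "without
# unreasonable delay but not later than 60 days" after discovery.
NOTIFICATION_WINDOW_DAYS = 60

def notification_deadline(discovery_date: date) -> date:
    """Return the outside date for notifying affected residents
    and the Attorney General after breach discovery."""
    return discovery_date + timedelta(days=NOTIFICATION_WINDOW_DAYS)

def substitute_notice_required(discovery_date: date,
                               identification_complete: date) -> bool:
    """Preliminary substitute notice is needed if identification of
    affected residents will not finish within the 60-day window."""
    return identification_complete > notification_deadline(discovery_date)

deadline = notification_deadline(date(2021, 10, 15))
print(deadline)  # 2021-12-14
```

A real incident response tracker would, of course, also capture the “without unreasonable delay” standard, which can require notice well before the 60-day outside date.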

  • Breach of Login Credentials.

The new law would add a section addressing unique notification requirements in the case of a breach of login credentials. In such a case, notice to an affected resident may be provided in electronic or other form that directs the resident to promptly change any password or security questions and answers, or to take other appropriate steps to protect the affected online account, or any account with the same login credentials.

  • HIPAA and HITECH Act Exception.

Any person subject to and in compliance with HIPAA and/or the HITECH Act privacy and security obligations is deemed in compliance with the new law, subject to two critical exceptions. First, as under New York’s SHIELD Act, a person subject to HIPAA or HITECH that is required to notify Connecticut residents of a data breach under HITECH must still notify Connecticut’s Attorney General at the same time residents are notified. Second, if the person would have been required to provide identity theft prevention and/or mitigation services under Connecticut law (for a period of 24 months), that requirement remains.

  • Investigation Materials.

Under the new law, documents, material and information connected to the investigation of a breach of security would be exempt from public disclosure, unless required to be made available to third parties by the Attorney General in furtherance of the investigation.

This new law, if signed, would keep Connecticut in line with other states across the nation currently enhancing their data breach notification laws in light of recent large-scale data breaches and heightened public awareness. Organizations across the United States should be evaluating and enhancing their data breach prevention and response capabilities.

Below are several resources for understanding current trends in the state data breach notification law landscape:

On May 13th, New York State Senator Kevin Thomas, Chair of NY’s Consumer Protection Committee, reintroduced the New York Privacy Act (“NYPA”), a comprehensive consumer privacy law similar in kind to the California Consumer Privacy Act (“CCPA”), California Privacy Rights Act (“CPRA”), and Virginia’s Consumer Data Protection Act (“CDPA”).  The NYPA had been introduced in a previous legislative session back in 2019, but failed to move forward in the legislative process.

This version of the NYPA is in some respects less ambitious than the prior version. For example, the latest version removed the bill’s broad application to any “legal entities that conduct business in New York” or that produce products or services that “intentionally target” New York residents, which would have meant that small-to-medium size businesses, and potentially even not-for-profits, would have been subject to the law. Nevertheless, the NYPA surpasses the CCPA and CDPA in some important respects, including by requiring data controllers to:

  • collect opt-in consent from consumers before processing their personal data for any purpose;
  • provide detailed disclosures about the activities of outside parties to whom they disclose personal data;
  • respond to consumer requests to correct personal data; and
  • make disclosures about their automated decision-making activities, afford consumers the opportunity to challenge automated decisions, and conduct and publish assessments on the impacts of their automated decision-making processes.

The NYPA would also impose on data controllers duties of loyalty and care – the latter of which would require an annual risk assessment of all of the data controller’s data processing activities – and take direct aim at targeted advertising and data sales, declaring that these activities “shall not be considered processing purposes that are necessary to provide services or goods requested by a consumer.”

“Consumers should have a right to choose if and how their personal information is collected and used by companies,” said Senator Thomas in his reintroduction of the NYPA. “And New Yorkers deserve to know that businesses who are collecting, processing and protecting their personally identifiable information are doing so ethically and responsibly. The New York Privacy Act will set new, groundbreaking standards for comprehensive privacy legislation by advancing consumer privacy rights and creating stronger industry standards that empower businesses to enhance consumer confidence by putting privacy and security front-and-center.”

Below is a rundown of the NYPA’s key components:

  • Application: The NYPA would apply to legal persons that conduct business in New York State or produce products or services intentionally targeted to residents in New York State and that satisfy at least one of the following thresholds:
    • have annual gross revenue of $25M or more;
    • control or process personal data of at least 100,000 New York residents;
    • control or process personal data of at least 500,000 persons nationwide, at least 10,000 of whom are New York residents; or
    • derive over 50% of their gross revenue from the sale of personal data, and control or process personal data of at least 25,000 New York residents.
  • Exemptions: Exempted from the NYPA are state and local governments, as well as personal data regulated by HIPAA, HITECH, FERPA, the DPPA, and the GLBA and, notably, “data sets maintained for employment records purposes, for purposes other than sale.”
  • Personal Data: Similar to the CCPA and CDPA, the NYPA defines personal data broadly to include “any data that is identified or could reasonably be linked, directly or indirectly, with a specific natural person, household, or device”. That said, unlike the CPRA,  CDPA or GDPR, the New York bill does not include a category for “sensitive data” to which heightened protections apply.
  • Consumer: The NYPA defines “consumer” as “a natural person who is a resident of New York acting only in an individual or household context.” The NYPA states that the definition of consumer does not include a “natural person acting in a commercial or employment context.”
  • Consumer Rights: The NYPA provides consumers a broad set of rights over their personal data, including the rights to:
    • receive clear notice of how their data is being used, processed and shared;
    • provide or withhold consent for the processing of their data for any purpose;
    • access and obtain a copy of their data in a commonly used electronic format, with the ability to transfer it between services;
    • correct inaccuracies in their data;
    • delete their data; and
    • challenge certain automated decisions.
  • Notice to Consumers: Under the NYPA, data controllers must provide written notice to consumers when processing their personal data in “easy-to-understand language at an eighth-grade reading level or below.” This notice must include a description of the consumers’ rights, the categories of personal data processed, the sources of that data, the purposes for which the data is processed, and the identities of all outside parties to whom the data is disclosed, as well as information about how those parties will use the data and how long they will retain it. The notice must be dated with its effective date and updated at least annually. The notice (as well as each version of the notice dating back six years) must be made readily available to consumers.
  • Non-Discrimination: The NYPA prohibits discrimination against a consumer who exercises their rights under the law. For example, a business may not target the consumer by denying goods or services or charging a higher price.
  • Data Broker Registry: The NYPA requires data brokers to register, pay an annual fee to the Attorney General, and submit information regarding their data use practices and contact information. The Attorney General must maintain a data broker registry on its website. Additionally, controllers must annually submit a list of all known data brokers or persons reasonably believed to be data brokers with whom the controller provided personal data in the preceding year and can only share personal data with data brokers that are properly registered.
  • Data Security: At least annually, under the NYPA, data controllers are required to conduct and document risk assessments of all current processing of personal data. In addition, data controllers must develop, implement, and maintain reasonable safeguards to protect the security, confidentiality and integrity of the personal data of consumers including adopting reasonable administrative, technical and physical safeguards appropriate to the volume and nature of the personal data at issue. The NYPA also imposes requirements related to data retention, data disposal and vendor management.
  • Enforcement and Private Right of Action: The NYPA authorizes the Attorney General to bring an action or special proceeding whenever it appears that a person has engaged or is about to engage in a violation of the law, with civil penalties of not more than $15,000 per violation (each instance of unlawful processing counts as a separate violation). And unlike comparable state laws, the NYPA would grant consumers a private right of action to enjoin violations of their rights under the law and to seek the greater of actual damages or liquidated damages in the amount of $1,000, along with attorney’s fees.  Contrary to other state consumer privacy bills introduced of late, such as Florida’s recently failed HB 969 or New York’s Biometric Privacy law, an organization found to have violated the NYPA does not have the opportunity to cure the violation before facing enforcement actions or litigation.

States across the country are contemplating ways to enhance their data privacy and security protections, with New York playing a leading role. In addition to the reintroduction of the NYPA, there are other consumer privacy bills under consideration by the New York state legislature, and the New York City Council recently passed a data privacy bill that would impose rigorous requirements on owners of “smart access” buildings; the Council has also created biometric information collection requirements for retail and hospitality businesses similar in kind to Illinois’s infamous Biometric Information Privacy Act (“BIPA”). Organizations, regardless of their location, should be assessing and reviewing their data collection activities, building robust data protection programs, and investing in written information security programs.

 

As we noted in our last post, there has been a flurry of data privacy and security activity in New York, with the State appearing poised to join California as a leader in this space.  Most recently, on April 29, 2021, the New York City Council passed the Tenant Data Privacy Act (“TDPA”), which would impose on owners of “smart access” buildings obligations related to their collection, use, safeguarding, and retention of tenant data.

Under the TDPA, a “smart access” building is one that uses electronic or computerized technology (e.g., a key fob), radio frequency identification cards, mobile phone applications, biometric information (e.g., fingerprints, voiceprints, hand or face geometry), or other digital technology to grant entry to the building, or to common areas or individual dwelling units therein.  The TDPA would require owners of smart access buildings to develop and maintain policies and procedures to address the following requirements:

  1. Express Consent. Before collecting “reference data” from a tenant for use in connection with the building’s smart access system, the building owner would be required to obtain the tenant’s express consent “in writing or through a mobile application.”  “Reference data” is the data used by the system to verify that the individual seeking access is authorized to enter.  Even after obtaining consent, the owner would only be permitted to collect the minimum amount of data necessary to enable the smart access system to function effectively.
  2. Privacy Policy. Building owners would also need to provide a “plain language” privacy policy to their tenants that includes certain disclosures, including the data elements that the system collects, the third parties with which data is shared, how the data is protected, and how long it will be retained.
  3. Stringent Security Safeguards. Additionally, the TDPA would require building owners to implement robust security measures and safeguards to protect the data of their tenants, guests, and other users of the smart access system.  At a minimum, these security measures would need to include data encryption, a password reset capability (if the system uses a password), and regularly updated firmware to address security vulnerabilities.
  4. Data Destruction. With limited exceptions, building owners would need to destroy any “authentication data” collected through their smart access systems no later than 90 days after collection.  “Authentication data” is the data collected from the user at the point of authentication, excluding any data generated through or collected by a video or camera system used to monitor entrances, but not to grant entry.
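
The 90-day destruction requirement in item 4 lends itself to an automated retention sweep. The sketch below is a hypothetical illustration; the record structure and field names are assumptions, and the statute’s “limited exceptions” are not modeled.

```python
from datetime import datetime, timedelta

# TDPA rule as described above: authentication data must be
# destroyed no later than 90 days after collection.
RETENTION_LIMIT = timedelta(days=90)

def records_due_for_destruction(records, now):
    """Return records whose authentication data has exceeded the
    90-day retention limit and must be destroyed."""
    return [r for r in records if now - r["collected_at"] > RETENTION_LIMIT]

records = [
    {"user": "tenant-a", "collected_at": datetime(2021, 1, 1)},
    {"user": "tenant-b", "collected_at": datetime(2021, 5, 1)},
]
overdue = records_due_for_destruction(records, datetime(2021, 6, 1))
print([r["user"] for r in overdue])  # ['tenant-a']
```

In practice, the destruction step itself (secure deletion from the smart access system and any backups) is the harder engineering problem; a sweep like this only identifies what is overdue.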

The TDPA would impose strict limits on the categories of tenant data that building owners would be permitted to collect, generate, or utilize through their smart access systems.  Specifically, they would only be permitted to collect:

  • the user’s name;
  • the dwelling unit number and that of other doors or common areas to which the user has access;
  • the user’s preferred method of contact;
  • the user’s biometric identifier information (if the smart access system utilizes such information);
  • the identification card number or any identifier associated with the physical hardware used to facilitate building entry (e.g., Bluetooth);
  • passwords, passcodes, usernames and contact information used singly or in conjunction with other reference data to grant the user access;
  • lease information, including move-in and, if available, move-out dates; and
  • the time and method of access (but solely for security purposes).

Building owners would also be prohibited, subject to certain exceptions, from selling, leasing, or otherwise disclosing tenant data to any third parties.  Building owners that wish to engage third-party vendors to operate or facilitate use of their smart access systems would be required to first (a) provide to users the name of the vendor, the intended use of user data by the vendor, and a copy of the vendor’s privacy policy, and (b) obtain the users’ express written authorization to disclose the users’ data to the vendor.

Significantly, the TDPA would also create a private right of action for tenants whose data is unlawfully sold.  Such tenants would be empowered to seek either compensatory damages or statutory damages ranging from $200 to $1,000 per tenant, along with attorneys’ fees.

Unless vetoed by the City’s Mayor, the TDPA will take effect at the end of June 2021, though building owners will be granted a grace period until January 1, 2023, to develop their compliance programs and replace or upgrade their smart access systems.  Building owners should use that time wisely, as the TDPA’s requirements will, in many instances, be a heavy lift.

The California Privacy Rights Act (CPRA) amended the California Consumer Privacy Act (CCPA) and has an operative date of January 1, 2023. The CPRA introduces new compliance obligations, including a requirement that businesses conduct risk assessments. While many U.S. companies currently conduct risk assessments for compliance with state “reasonable safeguards” statutes (e.g., Florida, Texas, Illinois, Massachusetts, New York) or the HIPAA Security Rule, the CPRA risk assessment has a different focus. This risk assessment requirement is similar to the EU General Data Protection Regulation’s (GDPR) data protection impact assessment (DPIA).

The goal of conducting a CPRA risk assessment is to restrict or prohibit the processing of personal information where the risks to a consumer’s privacy outweigh any benefits to the consumer, business, stakeholders, and public. Notably, the CPRA does not limit risk assessments to activities involving the processing of sensitive data. In addition to conducting the actual risk assessment, this process will require a preliminary determination of which data processing activities may present a significant risk to privacy rights. The business must document these risk assessments for submission to the California Privacy Protection Agency on a regular basis.

Under the CPRA, the documented risk assessment shall:

  • include whether the processing involves consumers’ sensitive personal information (e.g., social security, driver’s license, state identification card, or passport number; account log-in, financial account, debit card, or credit card number in combination with security or access code, password, or credentials for account; precise geolocation; racial or ethnic origin, religious or philosophical beliefs, or union membership; contents of mail, email, and text messages unless the business is the intended recipient of the communication; genetic data; biometric information processed for the purpose of uniquely identifying a consumer; information related to health, sex life or orientation); and
  • identify and weigh the benefits to the business, consumer, other stakeholders, and the public from the processing against the potential risks to the rights of the consumer whose data is being processed.

The CPRA directs the California Attorney General and California Privacy Protection Agency to issue implementing regulations, including regulations related to risk assessments. These regulations must be adopted by July 1, 2022 and will likely provide further guidance on the scope of and process for conducting and documenting risk assessments.

Complying with the CPRA will require expanded data mapping and advance planning, some of which may occur prior to issuance of the implementing regulations. During this time, businesses may find the GDPR instructive, particularly since the CCPA and CPRA borrow liberally from the regulation.

Under the GDPR and related guidelines, a DPIA is required or recommended where data processing is likely to result in a high risk to the privacy rights of individuals. This includes activities that

  • use automated processing, including profiling, to evaluate an individual’s personal aspects and on which decisions are based that produce significant effects
  • include large scale processing of sensitive data
  • process data on a large scale
  • match or combine datasets
  • process data of vulnerable individuals (e.g., children)
  • innovate or use new technologies

The DPIA must document and include

  • a description of the processing operations
  • the purposes of the processing
  • the legitimate interest pursued by the business, where applicable
  • an assessment of the necessity and proportionality of the processing activity in relation to the purposes
  • an assessment of the risks to the individual’s privacy rights
  • measures designed to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data

The CCPA and CPRA currently exclude employee personal information from certain provisions (e.g., the right to opt out, right to delete). This carve-out exempts employee personal information from the risk assessment requirement outlined above; however, the carve-out is due to expire on January 1, 2023. As businesses begin developing their risk assessment programs, they will want to monitor whether this exclusion for employee information will be extended and/or amended and how it might impact the risk assessment process.

As noted above, the operative date of the CPRA is January 1, 2023. Implementing regulations must be adopted by July 1, 2022 and civil and administrative enforcement activity can commence on July 1, 2023.

For additional information on the CPRA, please reach out to a member of our Privacy, Data and Cybersecurity practice group or check out our CPRA blog series.

Colorado recently became the latest state to consider a comprehensive consumer privacy law.  On March 19, 2021, Colorado State Senators Rodriguez and Lundeen introduced SB 21-190, entitled “an Act Concerning additional protection of data relating to personal privacy”. Following California’s bold example of the California Consumer Privacy Act (“CCPA”) effective since January 2020, Virginia recently passed its own robust privacy law, the Consumer Data Protection Act (“CDPA”), and New York, as well as other states, like Florida, appear poised to follow suit.  Furthermore, California is expanding protections provided by the CCPA, with the California Privacy Rights Act (CPRA) – approved by California voters under Proposition 24 in the November election.

Unsurprisingly, Colorado’s SB 21-190 generally tracks the CCPA, CDPA, CPRA and the EU General Data Protection Regulation (GDPR).  Key elements of the Colorado bill include:

  • Jurisdictional Scope. SB 21-190 would apply to legal entities that conduct business or produce products or services that are intentionally targeted to Colorado residents and that either:
    • Control or process personal data of more than 100,000 consumers per calendar year; or
    • Derive revenue from the sale of personal data and control or process the personal data of at least 25,000 consumers.
  • Exemptions. SB 21-190 includes various exemptions related to healthcare entities and health data, such as protected health information under HIPAA, patient identifying information maintained by certain substance abuse treatment facilities, and identifiable private information collected in connection with human subject research. Additional exemptions include, without limitation, personal data collected for the purposes of the Gramm-Leach-Bliley Act (GLBA), Driver’s Privacy Protection Act (DPPA), Children’s Online Privacy Protection Act (COPPA), and Family Educational Rights and Privacy Act (FERPA). Finally, data maintained for employment records purposes are exempted as well.
  • Personal Data. Similar to its counterparts, Colorado’s SB 21-190 broadly defines personal data to mean “information that is linked or reasonably linkable to an identified or identifiable individual.”
  • Sensitive Data. Like the CDPA, CPRA and GDPR, SB 21-190 includes a category for “sensitive data”. This is defined as “personal data revealing racial or ethnic origin, religious beliefs, a mental or physical health condition or diagnosis, sex life or sexual orientation, or citizenship or citizenship status,” or “genetic or biometric data that may be processed for the purpose of uniquely identifying an individual,” or “personal data from a known child”. As with Virginia’s CDPA, there are two key compliance obligations related to “sensitive data”.  First, sensitive data cannot be processed without obtaining consumer consent, or in the case of a known child or student, without obtaining consent from a parent or lawful guardian.  Second, the controller must conduct and document a data protection assessment specifically for the processing of sensitive data.
  • Protected Persons. SB 21-190 defines “consumer” as an “individual who is a Colorado resident acting only in an individual or household context”. The Colorado bill states that the definition of consumer does not include “an individual acting in a commercial or employment context”.
  • Consumer Rights. Under SB 21-190, consumers have the right to opt out of the processing of their personal data; access, correct, or delete the data; or obtain a portable copy of the data.
  • Data Protection Assessments. Akin to Virginia’s CDPA, the Colorado bill requires data controllers to conduct a data protection assessment for each of their processing activities involving personal data that presents a heightened risk of harm to consumers, such as processing for purposes of targeted advertising or processing sensitive data (as mentioned above).
  • Enforcement. If enacted, SB 21-190 would only be enforceable by the Colorado attorney general or district attorneys. A violation of the law could result in a civil penalty of not more than $2,000 for each violation (not to exceed $500,000 for any related series of violations), or an injunction.
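
The penalty structure in the Enforcement bullet above reduces to simple capped arithmetic. The sketch below is illustrative only; what counts as a “related series of violations” is a question the bill leaves to enforcement, and the function name is an assumption.

```python
# SB 21-190 penalty math as described above: up to $2,000 per
# violation, capped at $500,000 for any related series.
PER_VIOLATION_MAX = 2_000
SERIES_CAP = 500_000

def max_penalty(violations_in_series: int) -> int:
    """Maximum civil penalty for a related series of violations."""
    return min(violations_in_series * PER_VIOLATION_MAX, SERIES_CAP)

print(max_penalty(100))  # 200000
print(max_penalty(300))  # 500000  (cap reached)
```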

Colorado’s SB 21-190 is in the early stages of the legislative process; still, it signals the continued momentum building in states across the country to enhance consumer data privacy and security protections. Organizations, regardless of their location, should be carefully assessing their data collection activities, developing policies and procedures to address their evolving compliance obligations and data-related risks, and training their workforce on effective implementation of those policies and procedures.

The Illinois Supreme Court recently agreed to hear an appeal of an Appellate Court’s decision addressing whether an employee’s claim for damages under Illinois’s Biometric Information Privacy Act (“BIPA”) is preempted by the exclusivity provisions of the Illinois Workers’ Compensation Act (“IWCA”). Back in September, the Illinois Appellate Court for the First Judicial District held that employees’ BIPA claims were not preempted under the IWCA and could go forward.

The BIPA requires companies that collect and use biometric information to establish a policy and obtain a written release prior to collecting such data. Under the BIPA, individuals may sue for violations and, if successful, can recover liquidated damages ranging from $1,000 (or actual damages, whichever is greater) for negligent violations to $5,000 for intentional or reckless violations — plus attorneys’ fees and costs.
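
The damages tiers described above translate into straightforward exposure arithmetic. This is a hedged illustration only: how a “violation” accrues (per person, per scan, per statute section) is heavily litigated, attorneys’ fees and costs are excluded, and the function name is an assumption.

```python
# BIPA liquidated damages as described above:
# negligent violations -> $1,000 each (or actual damages, if greater);
# intentional or reckless violations -> $5,000 each.
NEGLIGENT = 1_000
INTENTIONAL_OR_RECKLESS = 5_000

def statutory_exposure(violations: int, intentional: bool,
                       actual_damages: int = 0) -> int:
    """Rough liquidated-damages exposure, excluding fees and costs."""
    if intentional:
        per_violation = INTENTIONAL_OR_RECKLESS
    else:
        per_violation = max(NEGLIGENT, actual_damages)
    return violations * per_violation

print(statutory_exposure(100, intentional=False))  # 100000
print(statutory_exposure(100, intentional=True))   # 500000
```

Even under conservative accrual assumptions, the per-violation amounts explain why BIPA class actions carry substantial settlement pressure.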

Over the past few years there have been a significant number of lawsuits under the BIPA, particularly after the Illinois Supreme Court held in 2019, in Rosenbach v. Six Flags, that individuals need not allege actual injury or adverse effect, beyond a violation of their rights under the BIPA, in order to qualify as an “aggrieved” person and be entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the Act. A key defense for employers defending BIPA lawsuits has been that the BIPA is preempted by the IWCA.

The plaintiff in the Illinois Supreme Court’s most recent case alleged that their employer violated the BIPA by requiring that employees use a fingerprint time clock system without properly: (1) informing the employees in advance and in writing of the specific purpose and length of time for which their fingerprints were being collected, stored, and used; (2) providing a publicly available retention schedule and guidelines for permanently destroying the scanned fingerprints; and (3) obtaining a written release from the employees prior to the collection of their fingerprints.  The employer moved to dismiss the complaint based on several arguments, including the assertion that the plaintiff’s claims would be barred by the exclusivity provisions of the IWCA.  The trial court denied the motion to dismiss, but certified the question for appeal regarding whether the IWCA exclusivity provisions bar a claim for statutory damages under the BIPA.

In September of 2020, the Appellate Court emphasized that the IWCA generally provides the exclusive means by which an employee can recover against an employer for a work-related injury; however, an employee can escape the exclusivity provisions of the IWCA if the employee establishes that the injury: (1) was not accidental, (2) did not arise from their employment, (3) was not received during the course of employment, or (4) was not compensable under the IWCA.  Focusing on the fourth exception, the Appellate Court concluded that a BIPA claim limited to statutory damages is not an injury compensable under the IWCA, and thus the plaintiff’s claims qualified under the fourth exception and were not preempted by the IWCA.

The Appellate Court, relying on Rosenbach, highlighted that because actual harm is not required under the BIPA to maintain a statutory damages claim, such a claim does not “[f]it within the purview of the Compensation Act, which is a remedial statute designed to provide financial protection for workers that have sustained an actual injury.”

The Illinois Supreme Court has now granted leave to appeal the Appellate Court’s ruling, addressing the issue of whether injuries resulting from BIPA claims fall under the scope of the IWCA. While there is no telling how the Supreme Court will ultimately rule, it certainly leaves open the possibility that the Court’s decision will help rein in the significant number of lawsuits, including putative class actions, filed under the BIPA.

If they have not already done so, companies should immediately take steps to comply with the statute. That is, they should review their time management, point of purchase, physical security, or other systems that obtain, use, or disclose biometric information (any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry used to identify an individual) against the requirements of the BIPA. If they find technical or procedural gaps in compliance – such as failing to provide written notice, obtain a release from the subject of the biometric information, obtain consent to provide biometric information to a third party, or maintain a policy and guidelines for the retention and destruction of biometric information – they need to remedy those gaps quickly.  For additional information on complying with the BIPA, please see our BIPA FAQs.

Virginia may be the first state to follow California’s lead on consumer privacy legislation, but it certainly will not be the last. The International Association of Privacy Professionals (IAPP) observed, “State-level momentum for comprehensive privacy bills is at an all-time high.” The IAPP maintains a map of state consumer privacy legislative activity, with in-depth analysis comparing key provisions. We discuss the Virginia legislation here, along with legislative activity in several other states where passage seems likely. Recall that California enacted the first data breach notification law, which became effective in 2003; within about 15 years, every U.S. state had such a law, as did many jurisdictions around the world.

Whether it is the pending Virginia Consumer Data Protection Act (VCDPA), the California Consumer Privacy Act (CCPA), or a similar framework, there are several features that should be considered when examining the effects of such laws on an organization:

  • Does the law apply? Neither the CCPA nor the VCDPA apply to all organizations doing business in the state. But, they may apply more broadly than initially assumed, including organizations without locations in the particular state. Also, some entities that control or are controlled by covered businesses also could become subject to one of these laws even if such entities would not otherwise fall into the law’s scope. Finally, data privacy and security laws increasingly reach third-party service providers to covered organizations either directly or indirectly through contracts that covered organizations must put in place.
  • Are we exempt? Perhaps just as important as whether an organization is covered by one of these laws is the question of whether an exemption applies. It is important to know that while an organization may not be exempt as a whole, certain classifications of data it maintains may be. For example, under the CCPA, “protected health information” covered by the Health Insurance Portability and Accountability Act (HIPAA) is generally exempt from the law. Of course, that information comes with its own compliance obligations!
  • What is Personal Information? Assuming an organization is covered by the law, the next question it may want to ask is what data is covered. As we have discussed, there are various definitions and understandings of personal information.  Similar to the CCPA and General Data Protection Regulation (GDPR), the VCDPA would define personal data broadly to include “any information that is linked or reasonably linkable to an identified or identifiable natural person.” Again, this broad definition should be read together with potential exemptions to obtain a firm understanding of the information within the scope of the law’s protections. In some cases, such as under the GDPR, and the amendment to the CCPA, the California Privacy Rights Act, there is a subset of personal information that comes with even more protections. Often referred to as “sensitive personal information,” this category can include personally identifiable information such as racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, citizenship or immigration status, genetic or biometric data, and geolocation data. Of course, covered organizations with these categories of data would need to understand those additional requirements.
  • Who is protected? It is not enough to know what kind of information is “personal information”; covered organizations also need to know whose personal information is protected under the law. Several of these laws protect “consumers,” defined generally as natural persons who reside in the jurisdiction. Basing the analysis solely on the word “consumer” and assuming it does not include employees, students, website visitors, etc. could be a mistake. Some frameworks have specific exclusions for these and other categories; others do not.
  • What rights do protected persons have? Ostensibly, a key purpose for this kind of privacy legislation is to empower individuals with respect to their personal information. That is, to give them more access to and control over their data that is collected, used, disclosed, maintained, and sold. To effectively comply with these measures, covered organizations need to understand the kinds of rights granted. These rights can include:
    • The right to know what personal information is collected and processed, why, and to access such personal information
    • The right to correct inaccuracies in the personal information
    • The right to delete personal information
    • The right to limit processing of personal information
    • The right to opt out of the processing or sale of personal information
  • Can my organization be sued for violations of the law? It is important to understand the consequences of failing to comply with any law. The flood of litigation under the Illinois Biometric Information Privacy Act (BIPA) which permits substantial recovery for failing to comply with notice and other requirements, even without a showing of actual harm, confirms the importance of examining this issue. Several of these privacy frameworks, including the CCPA and legislation supported by Governor DeSantis in Florida, include a private right of action in connection with data breaches.
  • How will the law be enforced? Related to the question of whether consumers can sue for violations is how the law will be enforced, what the potential penalties are, and how they are measured. In most cases, enforcement rests with the state Attorney General’s office. Often, the law requires that covered organizations be provided written notice of any violation and a period of time to cure it. Compliance can be challenging, so covered organizations should be aware of a law’s enforcement scheme so that, in cases where their compliance efforts fall short, they have a plan in place for quickly acting on such notices and curing any violations.

Answering these questions is certainly not the end of the analysis. For example, if covered, there are a whole host of additional questions organizations need to ask in order to evaluate compliance needs, allocate resources, identify affected business units, weigh risk management objectives, manage vendor compliance, and implement new policies and procedures, as needed. However, these questions can help to sharpen the big picture on the effect one or more of these privacy laws may have on your organization.


The California Privacy Rights Act (CPRA), passed in November, 2020, added to the California Consumer Privacy Act (CCPA) an express obligation for covered businesses to adopt reasonable security safeguards to protect personal information. The CPRA also clarified the CCPA’s private right of action for consumers whose personal information is breached due to a failure to implement such safeguards. But, remember, reasonable security safeguards are already required under California law, and that requirement is not limited to businesses subject to the CCPA/CPRA.

The CPRA adds subsection (e) to Cal. Civ. Code 1798.100, as follows:

A business that collects a consumer’s personal information shall implement reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure in accordance with Section 1798.81.5.

California Civil Code section 1798.81.5 requires a business that:

owns, licenses, or maintains personal information about a California resident shall implement and maintain reasonable security procedures and practices appropriate to the nature of the information, to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.

Unlike the CCPA/CPRA, section 1798.81.5 defines “business” more broadly to include “a sole proprietorship, partnership, corporation, association, or other group, however organized and whether or not organized to operate at a profit.” Thus, even if the CCPA, as amended by the CPRA, does not apply to your business, California law still may require the business to have reasonable security safeguards.

The meaning of “reasonable safeguards” is not entirely clear in California.  One place to look, however, is the California Data Breach Report issued in February 2016 by then-Attorney General (now Vice President) Kamala D. Harris. According to that report, an organization’s failure to implement all 20 of the controls set forth in the Center for Internet Security’s Critical Security Controls constitutes a lack of reasonable security.

So, although the CPRA generally is operative on January 1, 2023, California businesses might look to the 20 CIS controls at least as a starting point for securing personal information. With regard to which personal information to secure to minimize exposure under the CCPA/CPRA’s private right of action, the law is a bit clearer.

The CCPA extended the private right of action for data breaches only to personal information “defined in subparagraph (A) of paragraph (1) of subdivision (d) of Section 1798.81.5”:

(A)  An individual’s first name or first initial and the individual’s last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted:

(i) Social security number.

(ii) Driver’s license number, California identification card number, tax identification number, passport number, military identification number, or other unique identification number issued on a government document commonly used to verify the identity of a specific individual.

(iii) Account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.

(iv) Medical information.

(v) Health insurance information.

(vi) Unique biometric data generated from measurements or technical analysis of human body characteristics, such as a fingerprint, retina, or iris image, used to authenticate a specific individual. Unique biometric data does not include a physical or digital photograph, unless used or stored for facial recognition purposes.

The CPRA added to this list, a consumer’s “email address in combination with a password or security question and answer that would permit access to the account.”
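To make the scope of this definition concrete, here is a minimal sketch of how an organization might flag whether a compromised record contains the data elements enumerated above. The field names and the pairing logic are illustrative assumptions only (and a simplification of the statute's encryption/redaction qualifier), not legal advice or an exhaustive mapping of section 1798.81.5:

```python
# Hypothetical check: does a record contain the data elements that trigger
# the CCPA/CPRA private right of action, as summarized above?
# Field names are illustrative assumptions.

# Elements that qualify only in combination with the individual's name
NAME_PAIRED_ELEMENTS = {
    "ssn",
    "drivers_license",
    "government_id",
    "financial_account_with_access_code",
    "medical_information",
    "health_insurance_information",
    "unique_biometric_data",
}

def in_private_right_scope(record: dict) -> bool:
    """Return True if the record appears to contain in-scope personal
    information. `record` maps field names to booleans indicating the
    presence of unencrypted, unredacted data."""
    has_name = record.get("name", False)
    if has_name and any(record.get(f, False) for f in NAME_PAIRED_ELEMENTS):
        return True
    # CPRA addition: email address plus password or security question/answer,
    # which qualifies independently of the individual's name
    return record.get("email", False) and record.get("password_or_security_qa", False)
```

For example, a record with a name alone would not be in scope, but a name plus an unencrypted Social Security number, or an email address plus a password, would be.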

In the event a CCPA-covered business experiences a data breach involving personal information, the CCPA authorizes a private cause of action against the business if a failure to implement reasonable security safeguards caused the breach. If successful, a plaintiff can seek to recover statutory damages in an amount not less than $100 and not greater than $750 per consumer per incident or actual damages, whichever is greater, as well as injunctive or declaratory relief and any other relief the court deems proper. This means that plaintiffs generally do not have to show actual harm to recover. In case you were wondering, CCPA data breach litigation has already commenced.
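The per-consumer, per-incident structure is what makes class exposure escalate quickly. A back-of-the-envelope sketch of the statutory range described above (purely illustrative, and not accounting for how a court might actually assess damages):

```python
# Illustrative CCPA statutory damages range: $100-$750 per consumer per
# incident, or actual damages, whichever is greater.

def ccpa_exposure(consumers: int, incidents: int = 1,
                  actual_damages: float = 0.0) -> tuple:
    """Return (low, high) statutory exposure, floored by actual damages."""
    low = 100 * consumers * incidents
    high = 750 * consumers * incidents
    return (max(low, actual_damages), max(high, actual_damages))

# A single breach affecting 10,000 consumers yields a statutory range of
# $1 million to $7.5 million before any showing of actual harm.
```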

To bring such an action under the CCPA, a consumer must provide the business 30 days’ written notice specifying the violation and giving the business an opportunity to cure. If cured under the CCPA, no action may be initiated against the business for statutory damages. However, the CPRA clarifies that businesses cannot cure a failure to have reasonable safeguards before the breach:

implementation and maintenance of reasonable security procedures and practices pursuant to Section 1798.81.5 following a breach does not constitute a cure with respect to that breach.

The CPRA also calls for additional regulations requiring businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security, to (i) perform a cybersecurity audit on an annual basis, and (ii) submit to the California Privacy Protection Agency on a regular basis a risk assessment concerning the processing of personal information.

There is more to come following the passage of the CPRA, and businesses should be monitoring CCPA/CPRA developments. However, it is critical to ensure reasonable security safeguards are in place to protect personal information.

Enacted in 2008, the Illinois Biometric Information Privacy Act, 740 ILCS 14 et seq. (the “BIPA”), went largely unnoticed until a few years ago, when a handful of cases sparked a flood of class action litigation over the collection, use, storage, and disclosure of biometric information. Facing thousands of class action lawsuits, organizations have reevaluated and redoubled their compliance efforts. On January 28, 2021, a complaint was filed in Cook County, IL, Melvin v. Sequencing, LLC, alleging violations of the Illinois Genetic Information Privacy Act, 410 ILCS 513/1 (the “GIPA” – try not to get confused), which was originally effective in 1998.

Will the GIPA follow the BIPA?

The GIPA creates a private right of action using the same language as the BIPA:

Any person aggrieved by a violation of this Act shall have a right of action in a State circuit court or as a supplemental claim in a federal district court against an offending party.

However, while the BIPA provides for liquidated damages of $1,000 for each negligent violation and $5,000 for each intentional or reckless violation (or actual damages, if greater), the liquidated damages provisions under the GIPA are significantly higher: $2,500 and $15,000, respectively. If the holding of the Illinois Supreme Court in Rosenbach v. Six Flags Entertainment Corp., No. 123186 (Ill. Jan. 25, 2019) with regard to the BIPA is applied to the GIPA, plaintiffs could potentially maintain a cause of action and seek liquidated damages for alleged violations of the GIPA without any showing of actual injury beyond a violation of their rights under the Act.
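The gap between the two statutes' liquidated damages figures can be sketched numerically. This assumes per-violation stacking, an issue courts have not uniformly resolved; the numbers come from the statutes as described above, but the calculation itself is illustrative only:

```python
# Illustrative comparison of liquidated damages under the BIPA and the GIPA
# (or actual damages, if greater). Assumes damages stack per violation.

LIQUIDATED_DAMAGES = {
    "BIPA": {"negligent": 1_000, "intentional_or_reckless": 5_000},
    "GIPA": {"negligent": 2_500, "intentional_or_reckless": 15_000},
}

def exposure(statute: str, conduct: str, violations: int,
             actual_damages: float = 0.0) -> float:
    """Greater of per-violation liquidated damages or actual damages."""
    return max(LIQUIDATED_DAMAGES[statute][conduct] * violations, actual_damages)
```

On these assumptions, a single intentional GIPA violation ($15,000) would carry three times the liquidated damages of its BIPA counterpart ($5,000).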

Of note, in Sekura v. Krishna Schaumburg Tan, Inc., 2018 IL App (1st) 180175, the Illinois Appellate Court for the First Judicial District noted, in a pre-Rosenbach BIPA case, that the GIPA “provide[s] for a substantially identical, ‘any person aggrieved’ right of recovery” as the BIPA.  The First District noted that the GIPA was considered and amended during the same legislative session when the BIPA was passed, suggesting that the legislature intended a similar framework to apply to both statutes.

So, what are some of the requirements of the GIPA?

The GIPA is largely based on the federal Genetic Information Nondiscrimination Act (the “GINA”) and incorporates several terms and concepts from the Privacy Rule under the Health Insurance Portability and Accountability Act (“HIPAA”). This includes the term “genetic information,” which is defined under HIPAA regulation 45 CFR 160.103 and includes the manifestation of a disease in a family member, including one’s spouse. The GIPA also includes requirements applicable to genetic testing companies, health care providers, business associates, insurers, and employers.

While not an exhaustive list of requirements, in general, under GIPA:

  • Genetic testing and information derived from genetic testing is confidential and privileged and may be released only to the individual tested and to persons specifically authorized, in writing in accordance with Section 30 of GIPA, by that individual to receive the information.
  • An insurer may not seek information derived from genetic testing for use in connection with a policy of accident and health insurance.
  • An insurer shall not use or disclose protected health information that is genetic information for underwriting purposes. Examples of “underwriting purposes” include: (i) determining eligibility (including enrollment and continued eligibility) for benefits under the plan, coverage, or policy (including changes in deductibles or other cost-sharing mechanisms in return for activities such as completing a health risk assessment or participating in a wellness program), (ii) the computation of premium or contribution amounts under the plan, coverage, or policy (including discounts in return for activities, such as completing a health risk assessment or participating in a wellness program); and (iii) other activities related to the creation, renewal, or replacement of a contract of health insurance or health benefits.
  • Companies providing direct-to-consumer commercial genetic testing are prohibited from sharing any genetic test information or other personally identifiable information about a consumer with any health or life insurance company without written consent from the consumer.
  • Employers must treat genetic testing and genetic information consistent with the requirements of federal law, including but not limited to the GINA, the Americans with Disabilities Act, Title VII of the Civil Rights Act of 1964, the Family and Medical Leave Act of 1993, the Occupational Safety and Health Act of 1970, and certain other laws.
  • Employers may permit the disclosure of genetic testing information only in accordance with the GIPA.
  • Employers may not (i) solicit, request, require or purchase genetic testing or genetic information of a person or a family member of the person, or administer a genetic test to a person or a family member of the person as a condition of employment; (ii) affect the terms, conditions, or privileges of employment, or terminate the employment of any person because of genetic testing or genetic information with respect to the employee or family member; or (iii) retaliate against any person alleging a violation of this Act or participating in any manner in a proceeding under the GIPA.
  • Employers cannot use genetic information or genetic testing for workplace wellness programs benefiting employees unless (1) health or genetic services are offered by the employer, (2) the employee provides written authorization in accordance with the GIPA, (3) only the employee (or family member if the family member is receiving genetic services) and the licensed health care professional or licensed genetic counselor involved in providing such services receive individually identifiable information concerning the results of such services, and (4) any individually identifiable information is only available for purposes of such services and shall not be disclosed to the employer except in aggregate terms that do not disclose the identity of specific employees. Employers cannot penalize employees who do not disclose their genetic information or choose not to participate in a program requiring disclosure of the employee’s genetic information.

Whether an organization is a health care provider, a genetic testing company, an employer, or another company subject to the GIPA, it should review its policies and practices concerning genetic tests and genetic information. In Melvin v. Sequencing, LLC, the plaintiff alleges his genetic information was disclosed without his authorization. Based on our preliminary research, we could find no other cases addressing violations of the GIPA, so this may be a sign of more to come.  Note also that Illinois is not the only state with laws protecting genetic information.

In honor of Data Privacy Day, we provide the following “Top 10 for 2021.”  While the list is by no means exhaustive, it does provide some hot topics for organizations to consider in 2021.

  1. COVID-19 privacy and security considerations.

During 2020, COVID-19 presented organizations large and small with new and unique data privacy and security considerations. Most organizations, particularly in their capacity as employers, needed to adopt COVID-19 screening and testing measures resulting in the collection of medical and other personal information from employees and others. This will continue in 2021 with the addition of vaccination programs. So, for 2021, ongoing vigilance will be needed to maintain the confidential and secure collection, storage, disclosure, and transmission of medical and COVID-19 related data that may now include tracking data related to vaccinations or the side effects of vaccines.

Several laws apply to the data organizations may collect. In the case of employees, for example, the Americans with Disabilities Act (ADA) requires maintaining the confidentiality of employee medical information, which may include COVID-19 related data. Several state laws also have safeguard requirements and other protections for such data that organizations should be aware of when they, or others on their behalf, process that information.

Many employees will continue to telework during 2021. A remote workforce creates increased risks and vulnerabilities for employers in the form of sophisticated phishing email attacks or threat actors gaining unauthorized access through unsecured remote access tools. It also presents privacy challenges for organizations trying to balance business needs and productivity with expectations of privacy. These risks and vulnerabilities can be addressed and remediated through periodic risk assessments, robust remote work and bring your own device policies, and routine monitoring.

As organizations work to create safe environments for the return of workers, customers, students, patients and visitors, they may rely on various technologies such as wearables, apps, devices, kiosks, and AI designed to support these efforts. These technologies must be reviewed for potential privacy and security issues and implemented in a manner that minimizes legal risk.

Some reminders and best practices when collecting and processing information referred to above and rolling out these technologies include:

  • Complying with applicable data protection laws when data is collected, shared, secured, and stored, including the ADA, the Genetic Information Nondiscrimination Act, the CCPA, the GDPR, and various state laws. This includes providing required notice at collection under the California Consumer Privacy Act (CCPA), or required notice and a documented lawful basis for processing under the GDPR, if applicable;
  • Complying with contractual agreements regarding data collection; and
  • Contractually ensuring that vendors who have access to or collect data on behalf of the organization implement appropriate measures to safeguard the privacy and security of that data.
  2. The California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA)

On January 1, 2020, the CCPA ushered in a range of new rights for consumers, including:

  • The right to request deletion of personal information;
  • The right to request that a business disclose the categories of personal information collected and the categories of third parties to which the information was sold or disclosed;
  • The right to opt-out of sale of personal information; and
  • The California consumer’s right to bring a private right of action against a business that experiences a data breach affecting their personal information as a result of the business’s failure to implement “reasonable safeguards.”

The CCPA carves out (albeit not entirely) employment-related personal information from its provisions. It limits employee rights to notice of the categories of personal information collected by the business and the purpose for doing so, and the right to bring a private right of action against a business that experiences a data breach affecting their personal information.

In November, California voters passed the California Privacy Rights Act (CPRA), which amends and supplements the CCPA, expanding compliance obligations for companies and consumer rights. Of particular note, the CPRA extends the employment-related personal information carve-out until January 1, 2023. The CPRA also introduces consumer rights relating to certain sensitive personal information, imposes an affirmative obligation on businesses to implement reasonable safeguards to protect certain consumer personal information, and prevents businesses from retaliating against employees for exercising their rights.  The CPRA’s operative date is January 1, 2023, and draft implementation regulations are expected by July 1, 2022. Businesses should monitor CCPA/CPRA developments and ensure their privacy programs and procedures remain aligned with current CCPA compliance requirements.

In 2021, businesses can expect various states, including Washington, New York, and Minnesota to propose or enact CCPA-like legislation.

  3. Biometric Data

There was a continued influx of biometric privacy class action litigation in 2020, and this will likely continue in 2021. In early 2019, the Illinois Supreme Court handed down a significant decision concerning the ability of individuals to bring suit under the Illinois Biometric Information Privacy Act (BIPA). In short, individuals need not allege actual injury or adverse effect beyond a violation of their rights under the BIPA to qualify as an aggrieved person and be entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the Act.

Consequently, simply failing to adopt a policy required under BIPA, collecting biometric information without a release or sharing biometric information with a third party without consent could trigger liability under the statute. Potential damages are substantial as BIPA provides for statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation of the Act. There continues to be a flood of BIPA litigation, primarily against employers with biometric timekeeping/access systems that have failed to adequately notify and obtain written releases from their employees for such practices.

Like many aspects of 2020, biometric class action litigation has also been impacted by COVID-19. Screening programs in the workplace may involve the collection of biometric data, whether by a thermal scanner, facial recognition scanner or other similar technology. In late 2020, plaintiffs’ lawyers filed a class action lawsuit on behalf of employees concerning their employer’s COVID-19 screening program, which is alleged to have violated the BIPA. According to the complaint, employees were required to undergo facial geometry scans and temperature scans before entering company warehouses, without prior consent from employees as required by law. More class action lawsuits of this nature are likely on the horizon.

The law in this area is still lagging behind the technology but starting to catch up. In addition to Illinois’s BIPA, Washington and Texas have similar laws, and states including Arizona, Florida, Idaho, Massachusetts and New York have also proposed such legislation. The proposed biometric law in New York would mirror Illinois’ BIPA, including its private right of action provision. In California, the CCPA also broadly defines biometric information as one of the categories of personal information protected by the law.

Additionally, states are increasingly amending their breach notification laws to add biometric information to the categories of personal information that require notification, including 2020 amendments in California, D.C., and Vermont. Similar proposals across the U.S. are likely in 2021.

A report released by Global Market Insights, Inc. in November 2020 estimates that the global market valuation for voice recognition technology will reach approximately $7 billion by 2026, in large part due to the surge of AI and machine learning across a wide array of devices, including smartphones, healthcare apps, banking apps, and connected cars, to name a few. Voice recognition is generally classified as a biometric technology because it identifies a unique human characteristic (e.g., voice, speech, gait, fingerprints, iris or retina patterns), and as a result voice-related data qualifies as biometric information and, in turn, personal information under various privacy and security laws. For businesses exploring the use of voice recognition technology, whether for employee access to systems or in manufacturing a smart device for consumers or patients, there are a number of privacy and security compliance obligations to consider, including the CCPA, the GDPR, state data breach notification laws, the BIPA, COPPA, vendor contract statutes, and statutory and common law safeguarding mandates.

  4. HIPAA

During 2020, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services was active in enforcing HIPAA regulations. The past year saw OCR collect more than $13.3 million in total through resolution agreements. OCR settlements have affected a wide array of health industry businesses, including hospitals, health insurers, business associates, physician clinics, and mental health/substance abuse providers. Twelve of these settlements were under the OCR’s Right of Access Initiative, which enforces patients’ rights to timely access to medical records at reasonable cost. This level of enforcement activity is likely to continue in 2021.

The past year produced a significant amount of OCR-issued guidance relating to HIPAA. In March, OCR issued back-to-back guidance on COVID-19-related issues, first regarding the provision of protected health information (PHI) of COVID-19-exposed individuals to first responders, and next providing FAQs for telehealth providers. In July, the director of OCR issued advice to entities subject to HIPAA in response to the influx of recent OCR enforcement actions: “When informed of potential HIPAA violations, providers owe it to their patients to quickly address problem areas to safeguard individuals’ health information.” Finally, in September, OCR published best practices for creating an IT asset inventory list to assist health care providers and business associates in understanding where electronic protected health information (ePHI) is located within their organizations and in improving HIPAA Security Rule compliance; shortly after, it issued updated guidance on HIPAA for mobile health technology.

In December, Congress amended the Health Information Technology for Economic and Clinical Health Act to require the Secretary of Health and Human Services to consider certain recognized security practices of covered entities and business associates when making certain determinations, and for other purposes. In 2021, businesses will want to review their information security practices in light of applicable recognized security practices in an effort to demonstrate reasonable safeguards and potentially minimize penalties in the event of a cybersecurity incident.

  5. Data Breaches

The past year was marked by an escalation in ransomware attacks, sophisticated phishing emails, and business email compromises. Since many of these attacks were fueled in part by vulnerabilities due to an increased remote workforce, 2021 will likely be more of the same.