The Illinois Supreme Court recently agreed to hear an appeal of an Appellate Court decision addressing whether an employee’s claim for damages under Illinois’s Biometric Information Privacy Act (“BIPA”) is preempted by the exclusivity provisions of the Illinois Workers’ Compensation Act (“IWCA”). Back in September, the Illinois Appellate Court for the First Judicial District held that employees’ BIPA claims were not preempted by the IWCA and could go forward.

The BIPA requires companies that collect and use biometric information to establish a policy and obtain a written release prior to collecting such data. Under the BIPA, individuals may sue for violations and, if successful, can recover the greater of actual damages or liquidated damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation, plus attorneys’ fees and costs.

Over the past few years there have been a significant number of lawsuits under the BIPA, particularly after the Illinois Supreme Court held in 2019, in Rosenbach v. Six Flags, that individuals need not allege actual injury or adverse effect, beyond a violation of their rights under the BIPA, in order to qualify as an “aggrieved” person entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the Act. A key defense for employers in BIPA lawsuits has been that the BIPA is preempted by the IWCA.

The plaintiff in the Illinois Supreme Court’s most recent case alleged that their employer violated the BIPA by requiring employees to use a fingerprint time clock system without properly: (1) informing the employees in advance and in writing of the specific purpose and length of time for which their fingerprints were being collected, stored, and used; (2) providing a publicly available retention schedule and guidelines for permanently destroying the scanned fingerprints; and (3) obtaining a written release from the employees prior to the collection of their fingerprints. The employer moved to dismiss the complaint on several grounds, including the assertion that the plaintiff’s claims were barred by the exclusivity provisions of the IWCA. The trial court denied the motion to dismiss but certified for appeal the question of whether the IWCA exclusivity provisions bar a claim for statutory damages under the BIPA.

In September 2020, the Appellate Court emphasized that the IWCA generally provides the exclusive means by which an employee can recover against an employer for a work-related injury. However, an employee can escape the exclusivity provisions of the IWCA by establishing that the injury: (1) was not accidental; (2) did not arise from their employment; (3) was not received during the course of employment; or (4) was not compensable under the IWCA. Focusing on the fourth exception, the Appellate Court concluded that a BIPA claim limited to statutory damages is not an injury compensable under the IWCA, and thus the plaintiff’s claims qualified under the fourth exception and were not preempted.

The Appellate Court, relying on Rosenbach, highlighted that because actual harm is not required under the BIPA to maintain a statutory damages claim, such a claim does not,

“[f]it within the purview of the Compensation Act, which is a remedial statute designed to provide financial protection for workers that have sustained an actual injury.”

The Illinois Supreme Court has now granted leave to appeal the Appellate Court’s ruling and will address whether injuries resulting from BIPA violations fall within the scope of the IWCA. While there is no telling how the Supreme Court will ultimately rule, the grant of review certainly leaves open the possibility that the Court’s decision will help rein in the significant number of lawsuits, including putative class actions, filed under the BIPA.

If they have not already done so, companies should immediately take steps to comply with the statute. That is, they should review their time management, point of purchase, physical security, or other systems that obtain, use, or disclose biometric information (any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry used to identify an individual) against the requirements under the BIPA. In the event they find technical or procedural gaps in compliance – such as failing to provide written notice, obtain a release from the subject of the biometric information, obtain consent to provide biometric information to a third party, or maintain a policy and guidelines for the retention and destruction of biometric information – they need to remedy those gaps quickly. For additional information on complying with the BIPA, please see our BIPA FAQs.

Virginia may be the first state to follow California’s lead on consumer privacy legislation, but it certainly will not be the last. The International Association of Privacy Professionals (IAPP) observed, “State-level momentum for comprehensive privacy bills is at an all-time high.” The IAPP maintains a map of state consumer privacy legislative activity, with in-depth analysis comparing key provisions. We discuss the Virginia legislation here, along with legislative activity in several other states where passage seems likely. Recall that California enacted the first data breach notification law, effective in 2003; within about 15 years, every U.S. state had such a law, as did many jurisdictions around the world.

Whether it is the pending Virginia Consumer Data Protection Act (VCDPA), the California Consumer Privacy Act (CCPA), or a similar framework, there are several features that should be considered when examining the effects of such laws on an organization:

  • Does the law apply? Neither the CCPA nor the VCDPA applies to all organizations doing business in the state. But they may apply more broadly than initially assumed, including to organizations without locations in the particular state. Also, some entities that control or are controlled by covered businesses could become subject to one of these laws even if they would not otherwise fall within the law’s scope. Finally, data privacy and security laws increasingly reach third-party service providers to covered organizations, either directly or indirectly through contracts that covered organizations must put in place.
  • Are we exempt? Perhaps just as important as whether an organization is covered by one of these laws is the question of whether an exemption applies. It is important to know that while an organization may not be exempt as a whole, certain classifications of data it maintains may be. For example, under the CCPA, “protected health information” covered by the Health Insurance Portability and Accountability Act (HIPAA) is generally exempt from the law. Of course, that information comes with its own compliance obligations!
  • What is Personal Information? Assuming an organization is covered by the law, the next question it may want to ask is what data is covered. As we have discussed, there are various definitions and understandings of personal information.  Similar to the CCPA and General Data Protection Regulation (GDPR), the VCDPA would define personal data broadly to include “any information that is linked or reasonably linkable to an identified or identifiable natural person.” Again, this broad definition should be read together with potential exemptions to obtain a firm understanding of the information within the scope of the law’s protections. In some cases, such as under the GDPR, and the amendment to the CCPA, the California Privacy Rights Act, there is a subset of personal information that comes with even more protections. Often referred to as “sensitive personal information,” this category can include personally identifiable information such as racial or ethnic origin, religious beliefs, mental or physical health diagnosis, sexual orientation, citizenship or immigration status, genetic or biometric data, and geolocation data. Of course, covered organizations with these categories of data would need to understand those additional requirements.
  • Who is protected? It is not enough to know what kind of information is “personal information”; covered organizations also need to know whose personal information is protected under the law. Several of these laws protect “consumers,” defined generally as natural persons who reside in the jurisdiction. Basing the analysis solely on the word “consumer” and assuming it does not include employees, students, website visitors, etc. might be a mistake. Some frameworks have specific exclusions for these and other categories; others do not.
  • What rights do protected persons have? Ostensibly, a key purpose for this kind of privacy legislation is to empower individuals with respect to their personal information. That is, to give them more access to and control over their data that is collected, used, disclosed, maintained, and sold. To effectively comply with these measures, covered organizations need to understand the kinds of rights granted. These rights can include:
    • The right to know what personal information is collected and processed, why, and to access such personal information
    • The right to correct inaccuracies in the personal information
    • The right to delete personal information
    • The right to limit processing of personal information
    • The right to opt out of the processing or sale of personal information
  • Can my organization be sued for violations of the law? It is important to understand the consequences of failing to comply with any law. The flood of litigation under the Illinois Biometric Information Privacy Act (BIPA) which permits substantial recovery for failing to comply with notice and other requirements, even without a showing of actual harm, confirms the importance of examining this issue. Several of these privacy frameworks, including the CCPA and legislation supported by Governor DeSantis in Florida, include a private right of action in connection with data breaches.
  • How will the law be enforced? Related to the question of whether consumers can sue for violations is how the law will be enforced, what the potential penalties are, and how they are measured. In most cases, enforcement rests with the state Attorney General’s office. Often, the law requires that covered organizations be provided written notice of any violation and a period of time to cure it. Compliance can be challenging, so covered organizations should understand a law’s enforcement scheme so that, in cases where their compliance efforts may not be perfect, they have a plan in place for quickly acting on such notices and curing any violations.

Answering these questions is certainly not the end of the analysis. For example, if covered, there are a whole host of additional questions organizations need to ask in order to evaluate compliance needs, allocate resources, identify affected business units, weigh risk management objectives, manage vendor compliance, and implement new policies and procedures, as needed. However, these questions can help to sharpen the big picture on the effect one or more of these privacy laws may have on your organization.

 

The California Privacy Rights Act (CPRA), passed in November, 2020, added to the California Consumer Privacy Act (CCPA) an express obligation for covered businesses to adopt reasonable security safeguards to protect personal information. The CPRA also clarified the CCPA’s private right of action for consumers whose personal information is breached due to a failure to implement such safeguards. But, remember, reasonable security safeguards are already required under California law, and that requirement is not limited to businesses subject to the CCPA/CPRA.

The CPRA adds subsection (e) to Cal. Civ. Code 1798.100, as follows:

A business that collects a consumer’s personal information shall implement reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification, or disclosure in accordance with Section 1798.81.5.

California Civil Code section 1798.81.5 requires a business that:

owns, licenses, or maintains personal information about a California resident shall implement and maintain reasonable security procedures and practices appropriate to the nature of the information, to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.

Unlike the CCPA/CPRA, section 1798.81.5 defines “business” more broadly to include “a sole proprietorship, partnership, corporation, association, or other group, however organized and whether or not organized to operate at a profit.” Thus, even if the CCPA, as amended by the CPRA, does not apply to your business, California law still may require the business to have reasonable security safeguards.

The meaning of “reasonable safeguards” is not entirely clear under California law. One place to look, however, is the California Data Breach Report issued in February 2016 by then-California Attorney General (now Vice President) Kamala D. Harris. According to that report, an organization’s failure to implement all 20 of the controls set forth in the Center for Internet Security’s Critical Security Controls constitutes a lack of reasonable security.

So, although the CPRA generally becomes operative on January 1, 2023, California businesses might look to the 20 CIS controls at least as a starting point for securing personal information. As for which personal information to secure to minimize exposure under the CCPA/CPRA’s private right of action, the law is a bit clearer.

The CCPA extended the private right of action for data breaches only to personal information “defined in subparagraph (A) of paragraph (1) of subdivision (d) of Section 1798.81.5”:

(A)  An individual’s first name or first initial and the individual’s last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted:

(i) Social security number.

(ii) Driver’s license number, California identification card number, tax identification number, passport number, military identification number, or other unique identification number issued on a government document commonly used to verify the identity of a specific individual.

(iii) Account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.

(iv) Medical information.

(v) Health insurance information.

(vi) Unique biometric data generated from measurements or technical analysis of human body characteristics, such as a fingerprint, retina, or iris image, used to authenticate a specific individual. Unique biometric data does not include a physical or digital photograph, unless used or stored for facial recognition purposes.

The CPRA added to this list a consumer’s “email address in combination with a password or security question and answer that would permit access to the account.”

In the event a CCPA-covered business experiences a data breach involving personal information, the CCPA authorizes a private cause of action against the business if a failure to implement reasonable security safeguards caused the breach. If successful, a plaintiff can recover statutory damages of not less than $100 and not greater than $750 per consumer per incident, or actual damages, whichever is greater, as well as injunctive or declaratory relief and any other relief the court deems proper. This means that plaintiffs generally do not have to show actual harm to recover. In case you were wondering, CCPA data breach litigation has already commenced.
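For a rough sense of how the “per consumer per incident” arithmetic scales, consider the following Python sketch. It is illustrative only: the $100 to $750 range and the “whichever is greater” rule come from the statute, while the breach size and per-consumer figures are hypothetical.

```python
def ccpa_statutory_exposure(consumers: int, per_consumer: float,
                            actual_damages: float = 0.0) -> float:
    """Estimate CCPA data breach exposure: statutory damages of $100-$750
    per consumer per incident, or actual damages, whichever is greater."""
    if not 100 <= per_consumer <= 750:
        raise ValueError("statutory damages run $100-$750 per consumer per incident")
    return max(consumers * per_consumer, actual_damages)

# Hypothetical breach affecting 10,000 California consumers:
low = ccpa_statutory_exposure(10_000, 100)   # $1,000,000 at the statutory floor
high = ccpa_statutory_exposure(10_000, 750)  # $7,500,000 at the statutory ceiling
```

Even at the statutory floor, exposure grows linearly with the number of affected consumers, which is why class actions under this provision can be so costly.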

To bring such an action under the CCPA, a consumer must provide the business 30 days’ written notice specifying the violation and giving the business an opportunity to cure. If cured under the CCPA, no action may be initiated against the business for statutory damages. However, the CPRA clarifies that businesses cannot cure a failure to have reasonable safeguards before the breach:

implementation and maintenance of reasonable security procedures and practices pursuant to Section 1798.81.5 following a breach does not constitute a cure with respect to that breach.

The CPRA also calls for additional regulations requiring businesses whose processing of consumers’ personal information presents significant risk to consumers’ privacy or security, to (i) perform a cybersecurity audit on an annual basis, and (ii) submit to the California Privacy Protection Agency on a regular basis a risk assessment concerning the processing of personal information.

There is more to come following the passage of the CPRA, and businesses should be monitoring CCPA/CPRA developments. However, it is critical to ensure reasonable security safeguards are in place to protect personal information.

Enacted in 2008, the Illinois Biometric Information Privacy Act, 740 ILCS 14 et seq. (the “BIPA”), went largely unnoticed until a few years ago when a handful of cases sparked a flood of class action litigation over the collection, use, storage, and disclosure of biometric information. Seeing thousands of class action lawsuits, organizations have reevaluated and redoubled their compliance efforts. On January 28, 2021, a complaint was filed in Cook County, IL, Melvin v. Sequencing, LLC, alleging violations of the Illinois Genetic Information Privacy Act, 410 ILCS 513/1 – the “GIPA”…try not to get confused… which was originally effective in 1998.

Will the GIPA follow the BIPA?

The GIPA creates a private right of action using the same language as the BIPA:

Any person aggrieved by a violation of this Act shall have a right of action in a State circuit court or as a supplemental claim in a federal district court against an offending party.

However, while the BIPA provides for liquidated damages of $1,000 for each negligent violation and $5,000 for each intentional or reckless violation (or actual damages, if greater), the liquidated damages provisions under the GIPA are significantly higher: $2,500 and $15,000, respectively. If the holding of the Illinois Supreme Court in Rosenbach v. Six Flags Entertainment Corp., No. 123186 (Ill. Jan. 25, 2019) with regard to the BIPA is applied to the GIPA, plaintiffs could potentially maintain a cause of action and seek liquidated damages for alleged violations of the GIPA without any showing of actual injury beyond a violation of their rights under the Act.
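The gap in potential exposure is easy to see with a simple comparison. In this sketch, the per-violation figures come from the two statutes; the class size and violation type are hypothetical.

```python
# Liquidated damages per violation under each Act (or actual damages, if greater).
BIPA = {"negligent": 1_000, "intentional_or_reckless": 5_000}
GIPA = {"negligent": 2_500, "intentional_or_reckless": 15_000}

def exposure(schedule: dict, violations: int, kind: str,
             actual_damages: float = 0.0) -> float:
    """Liquidated damages for a given number of violations, or actual
    damages if greater, mirroring the statutes' 'whichever is greater' rule."""
    return max(violations * schedule[kind], actual_damages)

# Hypothetical: 500 class members, one intentional violation each.
bipa_total = exposure(BIPA, 500, "intentional_or_reckless")  # $2,500,000
gipa_total = exposure(GIPA, 500, "intentional_or_reckless")  # $7,500,000
```

On these assumptions, the same class of plaintiffs would be worth three times as much under the GIPA as under the BIPA.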

Of note, in Sekura v. Krishna Schaumburg Tan, Inc., 2018 IL App (1st) 180175, the Illinois Appellate Court for the First Judicial District noted, in a pre-Rosenbach BIPA case, that the GIPA “provide[s] for a substantially identical, ‘any person aggrieved’ right of recovery” as the BIPA. The First District also noted that the GIPA was considered and amended during the same legislative session in which the BIPA was passed, suggesting that the legislature intended a similar framework to apply to both statutes.

So, what are some of the requirements of the GIPA?

The GIPA is largely based on the federal Genetic Information Nondiscrimination Act (the “GINA”) and incorporates several terms and concepts from the Privacy Rule under the Health Insurance Portability and Accountability Act (“HIPAA”). This includes the definition of the term “genetic information,” which is defined under HIPAA regulation 45 CFR 160.103 and includes the manifestation of a disease or disorder in a family member, which in turn includes one’s spouse. The GIPA also includes requirements applicable to genetic testing companies, health care providers, business associates, insurers, and employers.

While not an exhaustive list of requirements, in general, under GIPA:

  • Genetic testing and information derived from genetic testing is confidential and privileged and may be released only to the individual tested and to persons specifically authorized, in writing in accordance with Section 30 of GIPA, by that individual to receive the information.
  • An insurer may not seek information derived from genetic testing for use in connection with a policy of accident and health insurance.
  • An insurer shall not use or disclose protected health information that is genetic information for underwriting purposes. Examples of “underwriting purposes” include: (i) determining eligibility (including enrollment and continued eligibility) for benefits under the plan, coverage, or policy (including changes in deductibles or other cost-sharing mechanisms in return for activities such as completing a health risk assessment or participating in a wellness program), (ii) the computation of premium or contribution amounts under the plan, coverage, or policy (including discounts in return for activities, such as completing a health risk assessment or participating in a wellness program); and (iii) other activities related to the creation, renewal, or replacement of a contract of health insurance or health benefits.
  • Companies providing direct-to-consumer commercial genetic testing are prohibited from sharing any genetic test information or other personally identifiable information about a consumer with any health or life insurance company without written consent from the consumer.
  • Employers must treat genetic testing and genetic information consistent with the requirements of federal law, including but not limited to the GINA, the Americans with Disabilities Act, Title VII of the Civil Rights Act of 1964, the Family and Medical Leave Act of 1993, the Occupational Safety and Health Act of 1970, and certain other laws.
  • Employers may permit the disclosure of genetic testing information only in accordance with the GIPA.
  • Employers may not (i) solicit, request, require or purchase genetic testing or genetic information of a person or a family member of the person, or administer a genetic test to a person or a family member of the person as a condition of employment; (ii) affect the terms, conditions, or privileges of employment, or terminate the employment of any person because of genetic testing or genetic information with respect to the employee or family member; or (iii) retaliate against any person alleging a violation of this Act or participating in any manner in a proceeding under the GIPA.
  • Employers cannot use genetic information or genetic testing for workplace wellness programs benefiting employees unless (1) health or genetic services are offered by the employer, (2) the employee provides written authorization in accordance with the GIPA, (3) only the employee (or family member if the family member is receiving genetic services) and the licensed health care professional or licensed genetic counselor involved in providing such services receive individually identifiable information concerning the results of such services, and (4) any individually identifiable information is only available for purposes of such services and shall not be disclosed to the employer except in aggregate terms that do not disclose the identity of specific employees. Employers cannot penalize employees who do not disclose their genetic information or choose not to participate in a program requiring disclosure of the employee’s genetic information.

Whether an organization is a health care provider, a genetic testing company, an employer, or another company subject to the GIPA, it should review its policies and practices concerning genetic tests and genetic information. In Melvin v. Sequencing, LLC, the plaintiff alleges his genetic information was disclosed without his authorization. Based on our preliminary research, we could find no other cases addressing violations of the GIPA, so this may be a sign of more to come. Note also that Illinois is not the only state with laws protecting genetic information.

In honor of Data Privacy Day, we provide the following “Top 10 for 2021.”  While the list is by no means exhaustive, it does provide some hot topics for organizations to consider in 2021.

  1. COVID-19 privacy and security considerations.

During 2020, COVID-19 presented organizations large and small with new and unique data privacy and security considerations. Most organizations, particularly in their capacity as employers, needed to adopt COVID-19 screening and testing measures resulting in the collection of medical and other personal information from employees and others. This will continue in 2021 with the addition of vaccination programs. So, for 2021, ongoing vigilance will be needed to maintain the confidential and secure collection, storage, disclosure, and transmission of medical and COVID-19 related data that may now include tracking data related to vaccinations or the side effects of vaccines.

Several laws apply to data that organizations may collect. In the case of employees, for example, the Americans with Disabilities Act (ADA) requires maintaining the confidentiality of employee medical information, which may include COVID-19 related data. Several state laws also impose safeguard requirements and other protections for such data that organizations should be aware of when they, or others on their behalf, process that information.

Many employees will continue to telework during 2021. A remote workforce creates increased risks and vulnerabilities for employers in the form of sophisticated phishing email attacks or threat actors gaining unauthorized access through unsecured remote access tools. It also presents privacy challenges for organizations trying to balance business needs and productivity with expectations of privacy. These risks and vulnerabilities can be addressed and remediated through periodic risk assessments, robust remote work and bring your own device policies, and routine monitoring.

As organizations work to create safe environments for the return of workers, customers, students, patients and visitors, they may rely on various technologies such as wearables, apps, devices, kiosks, and AI designed to support these efforts. These technologies must be reviewed for potential privacy and security issues and implemented in a manner that minimizes legal risk.

Some reminders and best practices when collecting and processing information referred to above and rolling out these technologies include:

  • Complying with applicable data protection laws when data is collected, shared, secured and stored including the ADA, Genetic Information Nondiscrimination Act, CCPA, GDPR and various state laws. This includes providing required notice at collection under the California Consumer Privacy Act (CCPA), or required notice and a documented lawful basis for processing under the GDPR, if applicable.
  • Complying with contractual agreements regarding data collection; and
  • Contractually ensuring that vendors who have access to or collect data on behalf of the organization implement appropriate measures to safeguard the privacy and security of that data.
  2. The California Consumer Privacy Act (CCPA) and the California Privacy Rights Act (CPRA)

On January 1, 2020, the CCPA ushered in a range of new rights for consumers, including:

  • The right to request deletion of personal information;
  • The right to request that a business disclose the categories of personal information collected and the categories of third parties to which the information was sold or disclosed;
  • The right to opt-out of sale of personal information; and
  • The California consumer’s right to bring a private right of action against a business that experiences a data breach affecting their personal information as a result of the business’s failure to implement “reasonable safeguards.”

The CCPA carves out (albeit not entirely) employment-related personal information from its provisions. Employees’ rights are limited to notice of the categories of personal information collected by the business and the purpose for doing so, and the right to bring a private right of action against a business that experiences a data breach affecting their personal information.

In November, California voters passed the California Privacy Rights Act (CPRA), which amends and supplements the CCPA, expanding compliance obligations for companies and consumer rights. Of particular note, the CPRA extends the employment-related personal information carve-out until January 1, 2023. The CPRA also introduces consumer rights relating to certain sensitive personal information, imposes an affirmative obligation on businesses to implement reasonable safeguards to protect certain consumer personal information, and prevents businesses from retaliating against employees for exercising their rights. The CPRA’s operative date is January 1, 2023, and draft implementing regulations are expected by July 1, 2022. Businesses should monitor CCPA/CPRA developments and ensure their privacy programs and procedures remain aligned with current CCPA compliance requirements.

In 2021, businesses can expect various states, including Washington, New York, and Minnesota, to propose or enact CCPA-like legislation.

  3. Biometric Data

There was a continued influx of biometric privacy class action litigation in 2020, and this will likely continue in 2021. In early 2019, the Illinois Supreme Court handed down a significant decision concerning the ability of individuals to bring suit under Illinois’s Biometric Information Privacy Act (BIPA). In short, individuals need not allege actual injury or adverse effect beyond a violation of their rights under the BIPA to qualify as an aggrieved person entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the Act.

Consequently, simply failing to adopt a policy required under the BIPA, collecting biometric information without a release, or sharing biometric information with a third party without consent could trigger liability under the statute. Potential damages are substantial, as the BIPA provides for statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation of the Act. There continues to be a flood of BIPA litigation, primarily against employers with biometric timekeeping/access systems that have failed to adequately notify and obtain written releases from their employees for such practices.

Like many aspects of 2020, biometric class action litigation has also been impacted by COVID-19. Screening programs in the workplace may involve the collection of biometric data, whether by a thermal scanner, facial recognition scanner or other similar technology. In late 2020, plaintiffs’ lawyers filed a class action lawsuit on behalf of employees concerning their employer’s COVID-19 screening program, which is alleged to have violated the BIPA. According to the complaint, employees were required to undergo facial geometry scans and temperature scans before entering company warehouses, without prior consent from employees as required by law. More class action lawsuits of this nature are likely on the horizon.

The law in this area is still lagging behind the technology but starting to catch up. In addition to Illinois’s BIPA, Washington and Texas have similar laws, and states including Arizona, Florida, Idaho, Massachusetts and New York have also proposed such legislation. The proposed biometric law in New York would mirror Illinois’ BIPA, including its private right of action provision. In California, the CCPA also broadly defines biometric information as one of the categories of personal information protected by the law.

Additionally, states are increasingly amending their breach notification laws to add biometric information to the categories of personal information that require notification, including 2020 amendments in California, D.C., and Vermont. Similar proposals across the U.S. are likely in 2021.

A report released by Global Market Insights, Inc. in November 2020 estimates the global market valuation for voice recognition technology will reach approximately $7 billion by 2026, in large part due to the surge of AI and machine learning across a wide array of devices including smartphones, healthcare apps, banking apps, and connected cars, to name a few. Voice recognition is generally classified as a biometric technology because it identifies a unique human characteristic (e.g., voice, speech, gait, fingerprints, iris or retina patterns), and as a result voice-related data qualifies as biometric information, and in turn personal information, under various privacy and security laws. For businesses exploring the use of voice recognition technology, whether for use by their employees to access systems or when manufacturing a smart device for consumers or patients, there are a number of privacy and security compliance obligations to consider, including the CCPA, the GDPR, state data breach notification laws, BIPA, COPPA, vendor contract statutes, and statutory and common law safeguarding mandates.

  1. HIPAA

During 2020, the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services was active in enforcing HIPAA regulations. The past year saw OCR collect more than $13.3 million in total resolution agreements. OCR settlements have impacted a wide array of health industry-related businesses, including hospitals, health insurers, business associates, physician clinics, and mental health/substance abuse providers. Twelve of these settlements were under OCR's Right of Access Initiative, which enforces patients' rights to timely access to medical records at reasonable cost. It is likely this level of enforcement activity will continue in 2021.

The past year produced a significant amount of OCR-issued guidance relating to HIPAA. In March, OCR issued back-to-back guidance on COVID-19-related issues, first regarding the provision of protected health information (PHI) of COVID-19-exposed individuals to first responders, and next providing FAQs for telehealth providers. In July, the director of OCR issued advice to entities subject to HIPAA in response to the influx of recent OCR enforcement actions: “When informed of potential HIPAA violations, providers owe it to their patients to quickly address problem areas to safeguard individuals’ health information.” Finally, in September, OCR published best practices for creating an IT asset inventory to help healthcare providers and business associates understand where electronic protected health information (ePHI) is located within their organizations and improve HIPAA Security Rule compliance, and shortly after it issued updated guidance on HIPAA for mobile health technology.

In December, Congress amended the Health Information Technology for Economic and Clinical Health (HITECH) Act to require the Secretary of Health and Human Services to consider certain recognized security practices of covered entities and business associates when making certain determinations. In 2021, businesses will want to review their information security practices in light of applicable recognized security practices in an effort to demonstrate reasonable safeguards and potentially minimize penalties in the event of a cybersecurity incident.

  1. Data Breaches

The past year was marked by an escalation in ransomware attacks, sophisticated phishing emails, and business email compromises. Since many of these attacks were fueled in part by vulnerabilities due to an increased remote workforce, 2021 will likely be more of the same.

The CCPA has reached the one-year mark. This is a good time for businesses to review the success of their compliance programs and recalibrate for the CCPA’s second year. Here are a few suggestions to kick off that review:

  1. Privacy Policies. The CCPA requires a business to update the information in its privacy policy, or any California-specific description of consumers’ privacy rights, at least once every 12 months. If a business has not already done so, now is a good time to review both online and offline data collection practices to ensure privacy policies accurately disclose, at a minimum, the categories of personal information (“PI”) the business collected in the preceding 12 months, the categories of PI it sold in the preceding 12 months, and the categories of PI it disclosed for a business purpose in the preceding 12 months.

Given the challenges of the last several months, a business may be collecting PI beyond what it currently discloses in its privacy policies. For example, a company may need to update its privacy policies to disclose the collection and use of COVID-19 related screening information, biometric information, or PI collected as a result of remote work situations.

If the business needs to update its privacy policy to reflect additional data collection activities, it will likely need to update its “notices at collection”, including employee and job applicant privacy notices.

  2. Employee Training. The CCPA provides that a business shall ensure all individuals responsible for handling inquiries about consumer rights, the business’s privacy practices, or its compliance with the CCPA are informed of applicable CCPA requirements. Businesses will want to review their training programs to ensure they include appropriate CCPA-related training; determine whether employee handbooks and manuals have been updated accordingly; and document that relevant employees have received training.
  3. Reasonable Safeguards. The CCPA does not currently impose an affirmative obligation on a business to implement reasonable safeguards to protect consumer PI; however, it provides consumers a private right of action where a consumer’s PI has been involved in a data breach resulting from the business’s failure to implement reasonable security safeguards. As a best practice, a business will want to review whether it has performed a risk assessment, at least annually, to identify new or enhanced risks, threats, or vulnerabilities to its systems or the PI it collects or maintains; whether it has reviewed and updated its written information security program and data retention schedule; and whether it has practiced its incident response plan.

CCPA compliance is an ongoing activity, and these three action items are particularly worthy of review at the one-year mark. A further year-end review might also include an assessment of the accessibility of the business’s website; confirmation that existing service provider agreements have been amended to satisfy the CCPA, where appropriate; and confirmation that all new service provider contracts include relevant CCPA provisions.

Although this post is focused on the CCPA, it is important to note that California recently passed the Consumer Privacy Rights Act (“CPRA”). The CPRA supplements and amends the CCPA. Two CPRA provisions are worth noting as they relate to items on this action item list. First, effective January 1, 2023, businesses will have an affirmative obligation to implement reasonable safeguards. Second, businesses will be required to disclose their collection and use of “sensitive personal information” and shall permit individuals to limit the business’s use of this information in certain circumstances. By adding these new provisions, the CPRA builds upon and expands the CCPA, inching it a bit closer to the EU General Data Protection Regulation.

The California Privacy Rights Act of 2020 (CPRA) becomes operative on January 1, 2023. Among its numerous amendments and additions to the existing California Consumer Privacy Act (CCPA), the CPRA expands the definition of Personal Information. Specifically, it adds the category of Sensitive Personal Information. This new category tracks the EU General Data Protection Regulation’s definition of Special Category Data, adds data elements commonly viewed in the U.S. as sensitive, and introduces a new twist by including the contents of a consumer’s mail, email, and text messages.

The CPRA broadly defines Sensitive Personal Information as Personal Information that is not publicly available and reveals:

  • a consumer’s social security, driver’s license, state identification card, or passport number;
  • a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
  • a consumer’s precise geolocation;
  • a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;
  • the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication;
  • a consumer’s genetic data;
  • the processing of biometric information for the purpose of uniquely identifying a consumer;
  • personal information collected and analyzed concerning a consumer’s health; or
  • personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

The addition of this new category of Personal Information creates two primary obligations for businesses. First, a business will need to include Sensitive Personal Information in its notice at collection to consumers, including job applicants and employees, and in any online privacy policy or California-specific description of consumer rights. Under the CPRA, this notice must now also disclose the categories of Sensitive Personal Information to be collected, the purposes for which they will be used, whether this information will be sold or shared, and the length of time the business intends to retain each category of Sensitive Personal Information.

Second, when a business collects or processes Sensitive Personal Information for the purpose of “inferring characteristics” about a consumer, it may only do so to provide services or goods requested by the consumer, for limited purposes enumerated by the CPRA, and as authorized by future implementation regulations. If the business intends to use or disclose this information for any other purpose, it must provide the consumer with notice of the intended use or disclosure and the consumer’s right to limit this use or disclosure. To facilitate exercising this right, a business must provide the consumer with an opt out mechanism entitled “Limit the Use of My Sensitive Personal Information.” Sensitive Personal Information that is not collected or processed for the purpose of inferring a consumer’s characteristics is not subject to this right to limit its use or disclosure.

Although the GDPR and CPRA share similar definitions of sensitive data, there are two significant differences worth noting. The GDPR prohibits collecting and processing Special Category Data absent the explicit, informed, affirmative (i.e., opt-in) consent of the individual, or pursuant to limited circumstances enumerated in the GDPR. In contrast, the CPRA permits collecting and processing Sensitive Personal Information. However, the consumer may limit (i.e., opt out of) the use and disclosure of this data when a business collects it for the purpose of inferring the consumer’s characteristics and will use or disclose it beyond what is necessary to provide requested services or goods to the consumer, and as narrowly permitted by the CPRA and any implementation regulations.

In anticipation of January 1, 2023, preparations should include revisiting or expanding existing data mapping activities to identify the collection of Sensitive Personal Information, reviewing the purpose for collecting this information and how the business uses or discloses it, and determining whether its use or disclosure is permitted or authorized by the CPRA. As with preparations for the CCPA, this will require an interdisciplinary team with a broad understanding of business operations. The team should include members familiar with the business’s advertising, marketing, and website data collection activities to help identify where Sensitive Personal Information may be collected for the purpose of inferring consumer characteristics.

For additional information on the CPRA, please reach out to a member of the Jackson Lewis Privacy, Data and Cybersecurity practice group or check out our CPRA blog series.

A new report released by Global Market Insights, Inc. last month estimates that the global market valuation for voice recognition technology will reach approximately $7 billion by 2026, in large part due to the surge of AI and machine learning across a wide array of devices including smartphones, healthcare apps, banking apps, and connected cars, to name a few. Whether it is a quick hands-free search on a phone or a voice command while driving, voice recognition technology has made consumer use nearly effortless. Particularly in the wake of the COVID-19 pandemic, companies that may never have considered voice recognition technology are now rethinking their employee access control systems and considering touchless authorization technologies, like voice recognition, as the main form of entry into their workspaces, as opposed to fingerprint scanners or keypads that increase the risk of spreading germs or viruses.

But while the ease and efficiency of voice recognition technology are clear, the privacy and security obligations associated with this technology cannot be overlooked. Voice recognition is generally classified as a biometric technology because it identifies a unique human characteristic (e.g., voice, speech, gait, fingerprints, iris or retina patterns), and as a result voice-related data qualifies as biometric information, and in turn personal information, under various privacy and security laws. For businesses that want to deploy voice recognition technology, whether for use by their employees to access systems or when manufacturing a smart device for consumers or patients, there are a number of privacy and security compliance obligations to consider. Here are just a few:

  • EU’s General Data Protection Regulation (GDPR)
    • The GDPR, effective since May 2018, treats “voice” as “personal data.” While GDPR Article 4(1), which defines “personal data,” does not specifically refer to “voice” but rather to “one or several properties unique to their physical, physiological identity…”, the European Data Protection Board has taken the position that “voice recognition” is an example of a physical or physiological biometric identification technique. For businesses that process the personal data of data subjects (EU residents), the GDPR grants those data subjects an array of rights (e.g., right to access, right to delete) and imposes significant privacy and security obligations on the controllers and processors of that data.
  • California Consumer Privacy Act (CCPA)
    • The recently enacted California Consumer Privacy Act (CCPA) may apply to a business that collects the personal data of a California resident, regardless of whether the organization is located in California. Under the Act, a covered business must provide a resident with information about its data collection practices, including the personal information it collects, discloses, and sells, as well as the right to delete this data and to object to its sale. Notably, the Act prohibits an individual from waiving these rights. The CCPA includes “biometric information” as an enumerated category of “personal information.” The Act’s definition of “biometric information” states that “[b]iometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted.”
  • Biometric Information Privacy Act (BIPA)
    • The BIPA sets forth a comprehensive set of rules for companies doing business in Illinois when collecting biometric identifiers or information of state residents. The BIPA has several key features:
      • informed consent prior to collection;
      • a limited right of disclosure of biometric information;
      • a written policy requirement addressing retention and data destruction guidelines; and
      • a prohibition on profiting from biometric data.
    The definition of “biometric identifiers” under the BIPA includes a “voiceprint” (using voice to verify an individual’s identity). Voiceprinting has been the subject of significant BIPA litigation of late, particularly in the context of virtual assistants. While these cases have been tossed for reasons unrelated to voiceprinting itself (e.g., lack of personal jurisdiction), as plaintiffs continue to expand the scope of BIPA targets, companies utilizing voiceprinting will increasingly face exposure to BIPA litigation.
  • Children’s Online Privacy Protection Act (COPPA)
    • Under COPPA, there are strict consent requirements for the collection and storage of data of children under 13. That said, in 2017 the Federal Trade Commission issued guidance on COPPA in the context of voice recordings, relaxing the rule a bit: “The Commission recognizes the value of using voice as a replacement for written words in performing search and other functions on internet-connected devices. Verbal commands may be a necessity for certain consumers, including children who have not yet learned to write or the disabled… as such when a covered operator collects an audio file containing a child’s voice solely as a replacement for written words, such as to perform a search or fulfill a verbal instruction or request, but only maintains the file for the brief time necessary for that purpose, the FTC would not take an enforcement action against the operator on the basis that the operator collected the audio file without first obtaining verifiable parental consent. Such an operator, however, must provide the notice required by the COPPA Rule, including clear notice of its collection and use of audio files and its deletion policy, in its privacy policy.” While the FTC has to date not issued any COPPA violations in the context of voice recordings, its requirements should not be ignored.
  • State Statutory and Common Law Mandates to Safeguard Personal Data
    • Multiple states impose an affirmative duty to use reasonable measures to safeguard personal data that an organization collects or owns, which increasingly includes biometric information. The applicability of these laws may depend on the location of the organization’s facilities and the consumer/employee/patient’s state of residency. Many of these safeguarding laws provide a general framework for compliance, without mandating specific measures. However, “reasonable” generally implies safeguards appropriate to the sensitivity of the data, and one need only look to more robust data security frameworks, such as under HIPAA and the Massachusetts data security regulations, to get a sense of what safeguards may be appropriate. These statutory duties to safeguard are driving increased contractual obligations between businesses exchanging personal information to carry out the terms of the agreement. At the same time, some courts have identified common law duties to safeguard personal data.
  • State Mandates Regarding Data Destruction and Disposal
    • Currently, more than thirty states have data destruction and disposal laws. These laws require taking reasonable steps to securely dispose of records containing personal information by shredding, erasing or other methods. States such as Massachusetts include biometric information as a category of personal information subject to these requirements. Organizations should also implement a data retention schedule that ensures the destruction of biometric information, including voiceprints, once it is no longer needed as part of meaningful data destruction practices.
  • State Data Breach Notification Laws
    • All fifty U.S. states have data breach notification laws. In general, these laws require an entity that owns or licenses personal information about a state resident to report a data breach to individuals whose personal information is affected and, in some cases, to the state attorney general or other agencies, the media, and credit reporting agencies. Each state has its own definition of personal information, and states such as California, Texas, Florida, and Arizona include health, medical, and/or biometric information. Unauthorized acquisition of, or access to, such personal information, whether by hackers or employee error, can require notifications to individuals, creating significant exposure and reputational harm for the organization. Perhaps a greater concern from such a compromise is the exfiltration of voiceprint data that could be used by hackers as credentials to access other user accounts.
  • Vendor Contract Statutes
    • An increasing number of states, including California, Massachusetts, New York, and Oregon, statutorily require a business to conduct due diligence before sharing or disclosing certain categories of personal information, likely including biometric information, with a third-party service provider. Many of these statutes also require contractually obligating the vendor to maintain safeguards appropriate to the sensitivity of the data, which is a good practice even if a written agreement is not mandated by the statute.

Conclusion

Voice recognition technology is booming and continues to reach facets of life that are hard even to contemplate. The technology brings innumerable potential benefits as well as significant data privacy and cybersecurity risks. Organizations that collect, use, and store voice data increasingly face compliance obligations as the law attempts to keep pace with technology, cybersecurity crime, and public awareness of data privacy and security. Creating a robust data protection program, or regularly reviewing an existing one, is a critical risk management and legal compliance step.

The City of Portland, Oregon becomes the first city in the United States to ban the use of facial recognition technologies in the private sector, citing, among other things, a lack of standards for the technology and wide ranges in accuracy and error rates that differ by race and gender. Failure to comply can be painful. Similar to the remedy available under the Illinois Biometric Information Privacy Act, which has fueled hundreds of class action lawsuits, the Ordinance provides persons injured by a material violation a cause of action for damages or $1,000 per day for each day of violation, whichever is greater. The Ordinance is effective January 1, 2021.

Facial recognition technology has become more popular in recent years, including during the COVID-19 pandemic. As the need arose to screen persons entering a facility for symptoms of the virus, including elevated temperature, thermal cameras, kiosks, and other devices embedded with facial recognition capabilities were put into use. However, many have objected to the use of this technology in its current form, citing problems with its accuracy, as summarized in a June 9, 2020 New York Times article, “A Case for Banning Facial Recognition.”

Under the Ordinance, a “private entity” shall not use “face recognition technologies” in “places of public accommodation” within the boundaries of the City of Portland. The Ordinance defines facial recognition technologies as:

an automated or semi-automated process that assists in identifying, verifying, detecting, or characterizing facial features of an individual or capturing information about an individual based on an individual’s face

Places of public accommodation include any place or service offering to the public accommodations, advantages, facilities, or privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise. This covers just about any private business and organization. Note, Portland also passed a separate ordinance prohibiting the use of facial recognition technology by the city government.

There are some exceptions, however. Places of public accommodation do not include “an institution, bona fide club, private residence, or place of accommodation that is in its nature distinctly private.” It is not clear from the Ordinance what it means to be “distinctly private.” Also, the Ordinance does not apply:

  • When facial recognition technologies are necessary to comply with federal, state, or local law,
  • For user verification purposes by an individual to access the individual’s own personal or employer issued communication and electronic devices, or
  • To automatic face detection services in social media applications.

So, in Portland, employees can still let their faces get them into their phones, including their company-provided devices. But businesses in Portland should evaluate whether they are using facial recognition technologies, whether they fall into one of the exceptions in the Ordinance, and, if not, what alternatives they have for the verification, security, and other purposes for which the technology was implemented.

Despite several attempts, Congress has struggled to push forward a federal consumer privacy law over the past few years. But the COVID-19 pandemic, which has raised concerns regarding location monitoring, GPS tracking, and the use of health data, has heightened the urgency for federal consumer privacy legislation. In May, a group of Democrats from the U.S. Senate and House of Representatives introduced the Public Health Emergency Privacy Act (“the Act”), aimed at protecting health information during the pandemic and regulating the use of that data by contact tracing technologies.

In late July, the Senate Committee on Appropriations introduced an Emergency Coronavirus Stimulus Package (“the Stimulus Package”), which would allocate $53 million of the $306 million package to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency for the protection of Coronavirus research data and related data. In addition, a group of 13 senators, including Kamala Harris, D-California, Elizabeth Warren, D-Massachusetts, and Mark Warner, D-Virginia, sent a letter to Senate and Congressional leadership asking for the Act to be included in the passage of the Stimulus Package.

“Health data is among the most sensitive data imaginable and even before this public health emergency, there has been increasing bipartisan concern with gaps in our nation’s health privacy laws,” the Senators stated in their letter.

“While a comprehensive update of health privacy protections is unrealistic at this time, targeted reforms to protect health data – particularly with clear evidence that a lack of privacy protections has inhibited public participation in screening activities – is both appropriate and necessary,” they added.

Under the Act, a “Covered Organization” is defined as “any person that collects, uses, or discloses emergency health data electronically or through communication by wire or radio; or that develops or operates a website, web application, mobile application, mobile operating system feature, or smart device application for the purpose of tracking, screening, monitoring, contact tracing, or mitigation, or otherwise responding to the COVID–19 public health emergency.” Note: Covered Organizations do not include a health care provider; a person engaged in a de minimis collection or processing of emergency health data; a service provider; a person acting in an individual or household capacity; or a public health authority.

The Act would protect “emergency health data” which means “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” Examples of such data include:

  • information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual, including data derived from testing an individual. This likely would include COVID-19 viral or serological test results, along with genetic data, biological samples, and biometrics;
  • other data collected in conjunction with other emergency health data or for the purpose of tracking, screening, monitoring, contact tracing, or mitigation, or otherwise responding to the COVID–19 public health emergency, such as (i) geolocation and similar information for determining the past or present precise physical location of an individual at a specific point in time; (ii) proximity data that identifies or estimates the past or present physical proximity of one individual or device to another, including information derived from Bluetooth, audio signatures, nearby wireless networks, and near-field communications; and (iii) any other data collected from a personal device.

Below are key requirements of the Act for Covered Organizations:

  • Only collect, use or disclose data that is necessary, proportionate and limited for a good-faith health purpose;
  • Take reasonable measures, where possible, to ensure the accuracy of data and provide a mechanism for individuals to correct inaccuracies;
  • Adopt reasonable safeguards to prevent unlawful discrimination on the basis of emergency health data;
  • Only disclose data to a government entity if it is to a public health authority and is solely for good faith public health purposes;
  • Establish and implement reasonable data security policies, practices and procedures;
  • Obtain affirmative express consent before collecting, using, or disclosing emergency health data, and provide individuals with an effective mechanism to revoke that consent. Note: There are limited exceptions where consent is not required, including to protect against fraud or malicious activity, to prevent a security incident, or if otherwise required by law;
  • Provide notice in the form of a privacy policy prior to collection that describes how and for what purposes the data will be used (including categories of recipients), the organization’s data security policies and practices, and how individuals may exercise their rights.

If the Act is enacted, the Federal Trade Commission (FTC) would be required to promulgate rules regarding data collection, use, and disclosure under the Act. In addition, both the FTC and state attorneys general would have enforcement authority over the Act.

The Act, if passed, would be a temporary measure that would terminate once COVID-19 is no longer deemed a public health emergency. Covered Organizations would be required to stop using or maintaining emergency health data within 60 days after the termination of the public health emergency and to destroy such data or render it not linkable.

With no comprehensive federal privacy framework in place, the Senators are urging Congressional leadership to include a measure on the theory that providing “Americans with assurance that their sensitive health data will not be misused will give Americans more confidence to participate in COVID screening efforts, strengthening our common mission in containing and eradicating COVID-19.”

We will continue to update on the status of the Act and other related developments.