Assessing the privacy and cybersecurity practices of third-party service providers is critical not only for employee personal information, but also for confidential and personal information pertaining to an organization’s business and its clients, customers, patients, students, etc. The Federal Trade Commission (FTC) announced a settlement on December 15 with a financial institution that it claimed failed to oversee the data security practices of one of its third-party service providers as required under the Gramm-Leach-Bliley Act’s Safeguards Rule.

The Safeguards Rule requires financial institutions to develop, implement, and maintain a comprehensive information security program. As part of that program, financial institutions must oversee their third-party vendors by ensuring they are capable of implementing and maintaining appropriate safeguards for customer information, and by requiring them to do so by contract. The FTC alleged the financial institution in this case failed to do this.

“Oversight of vendors is a critical part of any comprehensive data security program, particularly where those vendors can put sensitive consumer data at risk,” said Andrew Smith, Director of the FTC’s Bureau of Consumer Protection. “If you’re a financial company, vendor oversight is not just a good idea, it’s the law.”

In this case, the FTC alleges the financial institution’s vendor, which performed text recognition scanning on mortgage documents, stored the contents of the documents on a cloud-based server in plain text, without any protections to block unauthorized access, such as requiring a password or encrypting the information. And, according to the FTC, the financial institution (i) failed to adequately vet the vendor at issue and other vendors; (ii) did not have safeguard requirements in all vendor contracts; and (iii) did not conduct risk assessments of all of its third-party vendors, as required under the Safeguards Rule. Unfortunately, the complaint claims the server was accessed dozens of times, and the documents on the server contained sensitive information about mortgage holders and others, such as names, dates of birth, Social Security numbers, loan information, credit and debit account numbers, driver’s license numbers, and credit files.

It is important to note that similar statutory and regulatory requirements exist at the state level and at the federal level outside of the financial services industry. Here are some examples:

  • Under HIPAA, covered entities that work with certain third parties, known as business associates, must enter into “business associate agreements” setting out extensive contractual obligations on the business associate for privacy and security, which also apply directly to the business associate.
  • The New York Stop Hacks and Improve Electronic Data Security Act (SHIELD Act) requires “[a]ny person or business which owns or licenses computerized data which includes private information” of a resident of New York to “select[] service providers capable of maintaining appropriate safeguards, and require[] those safeguards by contract.”
  • Businesses subject to the data security regulations in Massachusetts, 201 CMR 17.00, must oversee service providers by (i) taking reasonable steps to select and retain those that are capable of maintaining appropriate security measures to protect such personal information consistent with the Massachusetts regulations and any applicable federal regulations, and (ii) requiring such service providers by contract to implement and maintain such appropriate security measures.
  • Several other states have similar requirements, including California, Colorado, Oregon, and Rhode Island.

The FTC’s proposed settlement requires the financial institution to, among other things:

  • undergo biennial assessments of the effectiveness of its data security program by an independent organization, which the FTC has authority to approve.
  • have a senior company executive annually certify the institution is complying with the final FTC order.
  • report any future data breaches to the FTC within 10 days of notifying other federal or state government agencies.

We discussed here some steps organizations could take to assess their third-party service providers’ capabilities concerning privacy and data security. Of course, these are not the only steps an organization might include in a vendor management program; the right mix is a function of the organization’s own risk assessment of the nature and extent of the sensitive data it shares with and processes through third-party service providers. At a minimum, any organization should be sure the master services agreement with the vendor requires the vendor to maintain reasonable safeguards for personal information. Regardless of the actual steps taken to address this risk, organizations should be regularly assessing the privacy and cybersecurity risks presented by third-party service providers and how to address them. And remember that because many such organizations are themselves service providers, they too may find themselves under increased scrutiny in this regard.

Setting up that new IoT device you received for Christmas? Maybe you’ve been derelict in feeding the dog and found a smart dog feeder under the tree, one that will alert you that Luna has been fed or that you have to refill the feeder. Smart gizmos are not just for the home: approximately 25% of businesses use Internet of Things (IoT) technology, a figure expected to grow substantially. With that growth will come new and varied applications for IoT technology, along with a need to understand the different kinds of risks it presents. Earlier this month, on December 4, 2020, President Trump signed the Internet of Things Cybersecurity Improvement Act of 2020 (Act). The Act is directed at federal agencies, but is likely to have a significant impact in the private sector as well.

Passed by the House in September 2020, the Act mandates the creation of a cybersecurity framework for the appropriate use and management by federal agencies of IoT devices owned or controlled by an agency and connected to information systems owned or controlled by an agency. Perhaps the most notable provision of the Act is for contractors of federal agencies and their subcontractors – effective two years from enactment, December 5, 2022, and subject to limited opportunities for a waiver, federal agencies will be:

prohibited from procuring or obtaining, renewing a contract to procure or obtain, or using an Internet of Things device, if the Chief Information Officer of that agency determines during a review required by section 11319(b)(1)(C) of title 40, United States Code, of a contract for such device that the use of such device prevents compliance with the standards and guidelines developed under [the Act].

What are the Standards and Guidelines to be Developed under the Act?

Within 90 days following enactment, the Act requires the Director of the National Institute of Standards and Technology (NIST) to develop and publish standards and guidelines on the appropriate use and management by federal agencies of IoT devices they own or control and which are connected to information systems they own or control. Along with bearing in mind standards, guidelines, and best practices developed by the private sector, agencies, and public-private partnerships, the Director also must consider the following for IoT devices:

  • Secure development.
  • Identity management.
  • Patching.
  • Configuration management.

In addition, within 180 days following enactment, the Director must publish guidelines for reporting, coordinating, publishing, and receiving of information about security vulnerabilities relating to information systems owned or controlled by an agency (including IoT devices) and resolving those vulnerabilities. The Director also must provide guidance for contractors and subcontractors on receiving information on potential information system vulnerabilities and disseminating information about resolutions.

What Does This Mean for IoT Devices?

For federal contractors and subcontractors, it will mean closely tracking and incorporating security standards and guidelines published by NIST, as well as being prepared to receive and act on information about potential security vulnerabilities received from federal agencies concerning devices and systems, and to disseminate information on resolutions for those vulnerabilities. However, the Act also may establish recognized best practices for IoT devices, resulting in broader adoption in the private sector. In the meantime, NIST has already started developing the standards and guidelines that will flow from the Act.

One of the last things pension plan participants would want to learn as they get ready to celebrate the Christmas holiday is that personal data from their pension accounts may have been compromised. This is the case, unfortunately, for approximately 30,000 Now:Pensions customers whose names, postal and email addresses, birth dates, and the equivalent of Social Security numbers were hacked and posted online. According to reports, the UK company, which helps administer millions of workplace pensions, attributed the incident to a third-party service provider.

Of course, the challenge of managing the cybersecurity risk of third-party service providers does not exist solely across the pond. During a recent SPARK Cybersecurity Virtual Event, Tim Hauser, Deputy Assistant Secretary for National Office Operations at DOL’s Employee Benefits Security Administration (EBSA), observed:

When a plan fiduciary is hiring somebody who is going to be responsible for confidential, personal information, or who’s going to be running systems to keep track of people’s account balances and the like, there’s a responsibility to make sure that you’ve hired that person prudently, that firm prudently…And if you think about plans and the universe I described, that’s just shy of $11 trillion, and with personal health and pension data, there are a lot of tempting targets there and what we’ve seen in our own enforcement actions, especially in our criminal programs, vulnerabilities are taken advantage of.

According to Hauser, the U.S. Department of Labor is developing guidance for plan sponsors in the U.S. that would cover cybersecurity issues and third-party service providers for retirement plans.

Like so many other organizations affected by a breach at one of their third-party service providers, Now:Pensions has provided notification to pension account holders and regulators. Reports indicate the breach occurred over a three-day period in mid-December and the compromised data had been obtained “by an unknown third party.”

At this point, similarly-situated organizations might be considering whether to move away from the service provider that caused the incident. Here are some reasons why that may not be the best course of action. However, one to-do list item that should be a given following a breach like this is to revisit the procurement process for selecting service providers, update it as needed to make sure it appropriately addresses cybersecurity risks, and ensure it is prudently implemented.

When it comes to ERISA employee benefit plans, hiring a service provider is in and of itself a fiduciary function. When considering a plan service provider’s level of cybersecurity, there are a number of steps plan sponsors and administrators can take to prudently assess the data privacy and security capabilities of potential plan service providers. Some examples include:

  • Take the general threats and vulnerabilities of plan service providers into account when conducting the organization’s enterprise data security risk assessment.
  • Meet with the service provider’s IT lead, but also others in the service provider’s organization – legal, accounting, HR, sales, etc. This will give you a better sense of the culture of privacy and security at the service provider.
  • Require the service provider to complete a detailed list of pointed data privacy and security questions, the answers to which should be actively evaluated by your IT team, counsel, and/or consultant.
  • Ask about prior data security incidents and how they were handled.
  • Review the service provider’s policies and procedures.
  • Require the service provider to submit to an independent data security audit/review or penetration test.
  • Ask the service provider about its data breach response plan, and how often it is practiced. Plan to include the service provider when you practice your own response plan, and gauge their openness to that.

This is not an exhaustive list, and each step could be fleshed out more or less depending on the risk the service provider presents. In addition, it is appropriate to incorporate suitable representations and additional protections concerning data privacy and security in the ultimate services agreement. The point is that because of the critical role service providers play, and the information they have access to (which may include not just personal information but also company proprietary data), the measures taken to evaluate plan service providers’ privacy and data security risk should happen at the procurement stage and on an ongoing basis, not just when a breach happens.

In April of this year, which seems far longer than eight months ago, we posted about an alert from federal agencies warning that cyber threat actors were exploiting the coronavirus pandemic to fuel phishing and other attacks. Those efforts have continued throughout the year with attackers now retooling their messaging around the COVID-19 vaccine. Criminal threat actors know millions are clamoring for information about the vaccine and are working to meet that demand with false information, largely through phishing attacks.

According to an alert from the New Jersey Cybersecurity & Communications Integration Cell (NJCCIC):

COVID-19 vaccine-themed phishing emails may include subject lines that make reference to vaccine registration, information about vaccine coverage, locations to receive the vaccine, ways to reserve a vaccine, and vaccine requirements.

For business and/or personal reasons, millions are clamoring for vaccination information and may let their guard down when they see it. In the process, they may divulge sensitive or financial information, or open malicious links or attachments. Phishing campaigns may employ brand spoofing and impersonate well-known and trusted entities, such as government agencies playing a central and critical role in the response to COVID-19 and the vaccination rollout. Messages such as the one below, for example, can lure an individual to want to participate and provide helpful information.

Other forms of attack target individuals who want a vaccine with advertisements for supposedly “legitimate” vaccines that are nothing of the sort. Organizations such as New Jersey’s Office of Homeland Security and Preparedness are working to get accurate information about COVID-19 to the public, such as through its Rumor Control and Disinformation web page. However, having accurate information available may not be enough to foil these attacks.

Organizations may not be able to prevent all attacks, but there are steps they could take to minimize the chance and impact of a successful attack, and to be prepared to respond. Among those steps is the critical need to maintain a level of security awareness, in addition to training. Annual trainings are a start, but may not be enough to keep up with nimble threat actors who deftly reshape their messaging and methods to improve their chances of success. They take in developments around the world and adapt on a far more frequent basis than annually.

Employees should be trained to recognize phishing attacks and dangerous sites, and instructed not to reveal personal, financial, or other confidential information about themselves, other employees, customers, and the company. Beyond that baseline, ongoing reminders about the morphing nature of these kinds of attacks can be instrumental in preventing them. Considering the past year and the more recent rise in COVID-19 cases, it is easy to understand how compelling information about a vaccine can be, so much so that it may be easy to forget the warnings given during that annual training on an early Monday morning in February.

The California Privacy Rights Act of 2020 (CPRA) becomes operative on January 1, 2023. Among its numerous amendments and additions to the existing California Consumer Privacy Act (CCPA), the CPRA expands the definition of Personal Information. Specifically, it adds the category of Sensitive Personal Information. This new category tracks the EU General Data Protection Regulation’s definition of Special Category Data, adds data elements commonly viewed in the U.S. as sensitive, and introduces a new twist by including the contents of a consumer’s mail, email, and text messages.

The CPRA broadly defines Sensitive Personal Information as Personal Information that is not publicly available and reveals:

  • a consumer’s social security, driver’s license, state identification card, or passport number;
  • a consumer’s account log-in, financial account, debit card, or credit card number in combination with any required security or access code, password, or credentials allowing access to an account;
  • a consumer’s precise geolocation;
  • a consumer’s racial or ethnic origin, religious or philosophical beliefs, or union membership;
  • the contents of a consumer’s mail, email and text messages, unless the business is the intended recipient of the communication;
  • a consumer’s genetic data;
  • the processing of biometric information for the purpose of uniquely identifying a consumer;
  • personal information collected and analyzed concerning a consumer’s health; or
  • personal information collected and analyzed concerning a consumer’s sex life or sexual orientation.

The addition of this new category of Personal Information creates two primary obligations for businesses. First, a business will need to include Sensitive Personal Information in its notice at collection to consumers, including job applicants and employees, and in any online privacy policy or California-specific description of consumer rights. Under the CPRA, this notice must now also disclose the categories of Sensitive Personal Information to be collected, the purposes for which they will be used, whether this information will be sold or shared, and the length of time the business intends to retain each category of Sensitive Personal Information.

Second, when a business collects or processes Sensitive Personal Information for the purpose of “inferring characteristics” about a consumer, it may only do so to provide services or goods requested by the consumer, for limited purposes enumerated by the CPRA, and as authorized by future implementation regulations. If the business intends to use or disclose this information for any other purpose, it must provide the consumer with notice of the intended use or disclosure and the consumer’s right to limit this use or disclosure. To facilitate exercising this right, a business must provide the consumer with an opt out mechanism entitled “Limit the Use of My Sensitive Personal Information.” Sensitive Personal Information that is not collected or processed for the purpose of inferring a consumer’s characteristics is not subject to this right to limit its use or disclosure.

Although the GDPR and CPRA share similar definitions of sensitive data, there are two significant differences worth noting. The GDPR prohibits collecting and processing Special Category Data absent the explicit, informed, affirmative (i.e., opt-in) consent of the individual, or pursuant to limited circumstances enumerated in the GDPR. In contrast, the CPRA permits collecting and processing Sensitive Personal Information. However, the consumer may limit (i.e., opt out of) the use and disclosure of this data when a business collects it for the purpose of inferring the consumer’s characteristics and will use or disclose it beyond what is necessary to provide requested services or goods to the consumer, and as narrowly permitted by the CPRA and any implementation regulations.

In anticipation of January 1, 2023, preparations should include revisiting or expanding existing data mapping activities to identify the collection of Sensitive Personal Information, reviewing the purpose for collecting this information and how the business uses or discloses it, and determining whether its use or disclosure is permitted or authorized by the CPRA. Similar to preparations for the CCPA, this will require an interdisciplinary team with a broad understanding of business operations. Any team should include members familiar with the business’ advertising, marketing, and website data collection activities to help identify where Sensitive Personal Information may be collected for the purpose of inferring consumer characteristics.

For additional information on the CPRA, please reach out to a member of the Jackson Lewis Privacy, Data and Cybersecurity practice group or check out our CPRA blog series.

On December 10, 2020, the California Department of Justice (“Department”) announced a fourth set of modifications to the California Consumer Privacy Act (CCPA) regulations. The deadline to submit comments on the modifications is Monday, December 28, 2020.

As a quick recap of past developments related to the CCPA regulations, the Department first published proposed regulations for public commentary on October 11, 2019. Then in February 2020, and again in March 2020, the Department announced a first and second set of modifications to the proposed regulations, based on comments received during the public commentary period. In October 2020, the Department issued a third set of modifications to the regulations, and received approximately 20 comments in response. The fourth set of modifications issued this week was developed in response to those comments, and to “clarify/conform” the proposed regulations to existing law.

The fourth set of modifications to the regulations primarily aims to clarify ambiguities regarding a consumer’s right to opt out, as well as a company’s use of an opt-out button and processing of opt-out requests.

Regarding the right to opt out, the modifications clarify that a business selling personal information collected from consumers in the course of interacting with them offline shall inform consumers of their right to opt out of the sale of their personal information by an offline method. The regulations illustrate this clarification: for example, a business that sells personal information collected over the phone may inform consumers of their right to opt out orally during the call when the information is collected.

In addition, the latest set of modifications reintroduced the opt-out button, providing the uniform logo that companies should use when implementing an opt-out button, as well as relevant instructions. It is worth noting that the opt-out button was initially introduced during the first set of modifications to the CCPA regulations, but was later removed due to negative feedback from privacy advocates.

Here is what the opt-out button will look like:

The latest modifications also add a new section to the regulations, which emphasizes that an opt-out button:

  • May be used in addition to posting the notice of right to opt-out, but not in lieu of any requirement to post the notice of right to opt-out or a ‘Do Not Sell My Personal Information’ link as otherwise required by the regulations; and
  • Where a business posts the ‘Do Not Sell My Personal Information’ link, the opt-out button shall be added to the left of the text demonstrated below. The opt-out button shall link to the same Internet webpage or online location to which the consumer is directed after clicking on the ‘Do Not Sell My Personal Information’ link.

Finally, the latest modifications provide instructions on a business’s methods for submitting consumer requests to opt out, highlighting that “requests to opt-out shall be easy for consumers to execute and shall require minimal steps to allow the consumer to opt-out.”

The Department will accept written comments to the latest modifications to the CCPA regulations between Friday, December 11, 2020 and Monday, December 28, 2020. Written comments may be submitted to the Department via email to PrivacyRegulations@doj.ca.gov.

It remains to be seen whether these latest modifications to the CCPA regulations will in fact be the final round, but given the active history of modifications, it would not be surprising if there were more to come. Companies should continue to monitor CCPA developments, and ensure their privacy programs and procedures remain aligned with current compliance requirements.


A new report released by Global Market Insights, Inc. last month estimates that the global market valuation for voice recognition technology will reach approximately $7 billion by 2026, in large part due to the surge of AI and machine learning across a wide array of devices, including smartphones, healthcare apps, banking apps, and connected cars, just to name a few. Whether used for a quick hands-free search on your phone or a voice command while driving, voice recognition technology has enhanced the effortlessness of consumer use. Particularly in the wake of the COVID-19 pandemic, companies that may never have considered voice recognition technology are now rethinking their employee access control systems and considering touchless authentication technologies, like voice recognition, as the main form of entry into their workspace, as opposed to fingerprint scanners or keypads that increase the risk of spreading germs or viruses.

But while the ease and efficiency of voice recognition technology are clear, the privacy and security obligations associated with this technology cannot be overlooked. Voice recognition is generally classified as a biometric technology, one that identifies an individual by a unique human characteristic (e.g., voice, speech, gait, fingerprints, iris or retina patterns). As a result, voice-related data qualifies as biometric information, and in turn personal information, under various privacy and security laws. For businesses that want to deploy voice recognition technology, whether for use by their employees to access systems or when manufacturing a smart device for consumers or patients, there are a number of privacy and security compliance obligations to consider. Here are just a few:

  • EU’s General Data Protection Regulation (GDPR)
    • The GDPR, effective since May 2018, treats “voice” as “personal data”. While GDPR Article 4(1), which defines “personal data”, does not specifically refer to “voice” but rather to “one or several properties unique to their physical, physiological identity…”, the European Data Protection Board has taken the position that “voice recognition” is an example of a physical or physiological biometric identification technique. For businesses that process the personal data of data subjects (EU residents), those data subjects are granted an array of rights (e.g., the right to access, the right to delete), along with significant privacy and security obligations on the controllers and processors of that data.
  • California Consumer Privacy Act (CCPA)
    • The recently enacted California Consumer Privacy Act (CCPA) may apply to a business that collects the personal data of a California resident, regardless of whether the organization is located in California. Under the Act, a covered business must provide a resident with information about its data collection practices, including the personal information it collects, discloses, and sells, as well as the right to delete this data and object to its sale. Notably, the Act prohibits an individual from waiving these rights. The CCPA includes “biometric information” as an enumerated category of “personal information.” The Act’s definition states that “[b]iometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted”.
  • Biometric Information Privacy Act (BIPA)
    • The BIPA sets forth a comprehensive set of rules for companies doing business in Illinois that collect biometric identifiers or information of state residents. The BIPA has several key features: informed consent prior to collection; a limited right to disclose biometric information; a written policy requirement addressing retention and data destruction guidelines; and a prohibition on profiting from biometric data. The definition of “biometric identifiers” under the BIPA includes a “voiceprint” (using voice to verify an individual’s identity). Voiceprinting has been the subject of significant BIPA litigation of late, particularly in the context of virtual assistants. While these cases have been tossed for reasons unrelated to voiceprinting itself (e.g., lack of personal jurisdiction), as plaintiffs continue to expand the scope of BIPA targets, companies utilizing voiceprinting will increasingly face exposure to BIPA litigation.
  • Children’s Online Privacy Protection Act (COPPA)
    • Under COPPA, there are strict consent requirements for the collection and storage of data of children under 13. That said, in 2017, the Federal Trade Commission issued guidance on COPPA in the context of voice recordings, relaxing the rule a bit: “The Commission recognizes the value of using voice as a replacement for written words in performing search and other functions on internet-connected devices. Verbal commands may be a necessity for certain consumers, including children who have not yet learned to write or the disabled… as such when a covered operator collects an audio file containing a child’s voice solely as a replacement for written words, such as to perform a search or fulfill a verbal instruction or request, but only maintains the file for the brief time necessary for that purpose, the FTC would not take an enforcement action against the operator on the basis that the operator collected the audio file without first obtaining verifiable parental consent. Such an operator, however, must provide the notice required by the COPPA Rule, including clear notice of its collection and use of audio files and its deletion policy, in its privacy policy.” While the FTC has to date not issued any COPPA violations in the context of voice recordings, its requirements should not be ignored.
  • State Statutory and Common Law Mandates to Safeguard Personal Data
    • Multiple states impose an affirmative duty to use reasonable measures to safeguard personal data that an organization collects or owns, which increasingly includes biometric information. The applicability of these laws may depend on the location of the organization’s facilities and the consumer/employee/patient’s state of residency. Many of these safeguarding laws provide a general framework for compliance, without mandating specific measures. However, “reasonable” generally implies safeguards appropriate to the sensitivity of the data, and one need only look to more robust data security frameworks, such as under HIPAA and the Massachusetts data security regulations, to get a sense of what safeguards may be appropriate. These statutory duties to safeguard are driving increased contractual obligations between businesses exchanging personal information to carry out the terms of the agreement. At the same time, some courts have identified common law duties to safeguard personal data.
  • State Mandates Regarding Data Destruction and Disposal
    • Currently, more than thirty states have data destruction and disposal laws. These laws require taking reasonable steps to securely dispose of records containing personal information by shredding, erasing, or other methods. States such as Massachusetts include biometric information as a category of personal information subject to these requirements. As part of meaningful data destruction practices, organizations should also implement a data retention schedule that ensures the destruction of biometric information, including voiceprints, once it is no longer needed.
  • State Data Breach Notification Laws
    • All fifty U.S. states have data breach notification laws. In general, these laws require an entity that owns or licenses personal information about a state resident to report a data breach to individuals whose personal information is affected and, in some cases, to the state attorney general or other agencies, the media, and credit reporting agencies. Each state has its own definition of personal information, and states such as California, Texas, Florida, and Arizona include health, medical, and/or biometric information. Unauthorized acquisition of or access to such personal information, whether by hackers or employee error, can require notifications to individuals, creating significant exposure and reputational harm to the organization. Perhaps a greater concern from such a compromise is the exfiltration of voiceprint data that could be used by hackers as credentials to access other user accounts.
  • Vendor Contract Statutes
    • An increasing number of states, including California, Massachusetts, New York, and Oregon, statutorily require a business to conduct due diligence before sharing or disclosing certain categories of personal information, which likely include biometric information, with a third-party service provider. Many of these statutes also require contractually obligating the vendor to maintain safeguards appropriate to the sensitivity of the data, which is a good practice even if a written agreement is not mandated by the statute.

Conclusion

Voice recognition technology is booming, and it continues to reach facets of everyday life that would have been hard to contemplate even a few years ago. The technology brings innumerable potential benefits as well as significant data privacy and cybersecurity risks. Organizations that collect, use, and store voice data increasingly face compliance obligations as the law attempts to keep pace with technology, cybersecurity crimes, and public awareness of data privacy and security. Creating a robust data protection program, or regularly reviewing an existing one, is a critical risk management and legal compliance step.

 

On November 3, 2020, Californians approved another significant piece of privacy rights legislation, the California Privacy Rights Act, or the CPRA.  The CPRA amends and expands the already (almost) infamous CCPA (California Consumer Privacy Act), which is the privacy law that went into effect in the Golden State last year.

New Rights under CPRA

The CPRA provides for, among other things, new and expanded rights for consumers.  The new rights under the CPRA include:

  • Right to Correct Information. A consumer may request that a business correct his or her personal information if it is inaccurate. Covered businesses must disclose this new right to consumers and use “commercially reasonable efforts” to correct personal information upon receiving a verifiable consumer request.
  • Right to Limit Sensitive Personal Information. The CPRA creates a sub-category of personal information, labeled “sensitive personal information.” The definition of sensitive personal information includes 20 different data points, including, for example, racial origin, religious beliefs, sexual orientation, and geolocation. A consumer may limit the use and disclosure of sensitive personal information to that “which is necessary to perform the services or provide the goods reasonably expected by an average consumer who requests such goods and services,” subject to certain exemptions.  For example, a consumer may prohibit a business from disclosing sensitive personal information to third parties, in most cases.  A covered business is required to implement a process (like a clearly labeled link) to allow consumers to limit the use of sensitive personal information.
  • Right to Access Information About Automated Decision-Making. Consumers may request information about the logic involved in automated decision-making and a description of the likely outcome of the process.
  • Right to Opt-Out of Automated Decision-Making Technology. Consumers are allowed to opt-out of the use of automated decision-making technology in connection with decisions about the consumer’s work performance, economic situation, health, personal preferences, interests, reliability, behavior, location, or movements.

Expanded and Modified Rights Under the CPRA

There are also several expanded and modified rights under the CPRA, including:

  • Expanded Right to Know. For personal information collected on or after January 1, 2022, the CPRA allows a consumer to make a request to know beyond the CCPA’s normal 12-month look-back period, as long as doing so is not “impossible” and does not involve a “disproportionate” effort.  However, this expanded right does not require a business to keep personal information for any specific period of time.
  • Expanded Right to Opt Out. The CPRA expands the existing opt-out right to include both the sale and “sharing” of personal information, which is defined as the transfer or making available of a “consumer’s personal information by the business to a third party for cross-context behavioral advertising, whether or not for monetary or other valuable consideration.”
  • Modified Right to Delete. Businesses that receive a consumer deletion request are required to notify third parties who bought or received the consumer’s personal information, subject to some exceptions. Service providers and contractors also must pass the deletion request downstream in certain circumstances.
  • Expanded Right to Data Portability. A consumer may request that a business transmit his or her personal information to another entity, to the extent it is technically feasible.
  • Strengthened Opt-In Rights for Minors.  Businesses must wait 12 months before asking a minor for consent to sell or share his or her personal information after the minor has declined to provide it.

Employee Privacy Rights under the CPRA

The CPRA specifically calls out the privacy interests of employees, noting the differences in the relationships between employer and employee versus business and consumer.  (CPRA, Sec. 8, Purpose and Intent.)  Like the CCPA, the CPRA does not extend the full scope of consumer rights to applicants, employees, and independent contractors, and it keeps it that way until January 1, 2023, unless the CPRA is further amended. However, employees, applicants, and independent contractors do have the following rights (and employers should be putting processes in place to address these if they do not already have them per the CCPA): 1) the right to receive notice at collection; and 2) the right to sue if their sensitive personal information is breached as a result of their employer not having reasonable safeguards in place.

Conclusion

Companies should continue to monitor CCPA/CPRA developments, and ensure their privacy programs and procedures remain aligned with current compliance requirements.

 

In late September, the United States District Court for the Eastern District of Louisiana issued a first-of-its-kind ruling regarding the Telephone Consumer Protection Act (“TCPA”). The court held that the TCPA provision at 47 U.S.C. § 227(b)(1)(A)(iii) – which prohibits calls (and messages) made using an automatic telephone dialing system (“ATDS”) to any cellular telephone number – is unenforceable retroactively for the five-year period from November 2015, when Congress amended the TCPA to include an exemption for government-debt collection, until July 2020, when the U.S. Supreme Court ruled the government-debt exception was unconstitutional.

In July, the Supreme Court held in Barr v. American Association of Political Consultants (“AAPC”) that Congress impermissibly favored government-debt collection speech over political and other speech, in violation of the First Amendment, and that the government-debt collection exception of the TCPA must therefore be invalidated and severed from the remainder of the statute. Despite the potential that the Court would address the constitutionality of the TCPA in its entirety, the Court left untouched the TCPA’s general restriction on calls made with an ATDS.

In response to the Supreme Court’s ruling, the federal court in Louisiana emphasized that:

Congress’s 2015 enactment of the government-debt exception rendered § 227(b)(1)(A)(iii) an unconstitutional content-based restriction on speech. In the years preceding Congress’s addition of the exception, § 227(b)(1)(A)(iii) did not discriminate on the content of robocalls, and was, as the Supreme Court has observed, a constitutional time-place-manner restriction on speech.  Likewise, now that [American Association of Political Consultants] has done away with the offending exception, § 227(b)(1)(A)(iii) figures to remain good law in the years to come.  However, in the years in which § 227(b)(1)(A)(iii) permitted robocalls of one category of content (government-debt collection) while prohibiting robocalls of all other categories of content, the entirety of the provision was, indeed, unconstitutional.

This groundbreaking Louisiana decision has already started a trend in court analysis of the issue. Only weeks later, a federal district court in Ohio issued a similar ruling in Lindenbaum v. Realgy Inc., granting the defendant’s motion to dismiss in light of the retroactive impact of AAPC, and reasoning:

The plaintiffs in AAPC sought the right to speak going forward on the grounds that the statute, as written, is an unconstitutional content-based restriction. The Supreme Court denied that relief, but offered a remedy in the form of eliminating the content based restriction. But, in our case, severance of the content-based restriction does not offer a “remedy” to correct past harm. Here, defendants do not seek the right to speak, having already done so. They seek the right to be free from punishment for speaking during a time when an unconstitutional content-based restriction existed. A forward-looking fix offers no remedy for this past wrong.

As many currently active TCPA cases involve calls/texts/faxes sent between November 2015 and July 2020, these rulings have the potential to have an immediate and significant impact on TCPA class action litigation.  The rulings’ impact is heightened by the fact that the courts dismissed each plaintiff’s claims on the ground that the court lacked subject matter jurisdiction, as “federal courts lack authority to enforce unconstitutional laws.”  A subject matter jurisdiction dismissal is available in all phases of litigation and cannot be waived, increasing the number of cases that could potentially be affected by these rulings.

Notably, the Louisiana court acknowledged the likelihood of a circuit split arising from its ruling, but placed culpability for this on the Supreme Court’s decision in AAPC that lacked a “clear majority” – Justice Kavanaugh authored a plurality decision. “The court’s failure to unite behind a sufficiently agreeable rationale does a disservice to litigants and lower courts…Here, it has led the parties to wildly dissimilar understandings of AAPC’s legal effect — all in the utmost good faith and preparation. In the future, it may engender a circuit split which confronts the court anew.”

2020 has been an important year for TCPA developments, and 2021 is likely to be much the same. Organizations are advised to review and update their telemarketing and/or automatic dialing practices to ensure TCPA compliance.


Already at the cutting edge of U.S. privacy law, California jumped even further ahead of the pack with the recent approval by state voters of the California Privacy Rights Act (“CPRA”).  The CPRA, which builds upon the already extensive framework of privacy rights and obligations established in the California Consumer Privacy Act (“CCPA”), is likely to be met with weariness by many subject organizations, which have, over the past couple of years, invested significant effort and resources to come into compliance with the CCPA.

Through this post, and those that follow in our CPRA Series, we will attempt to lessen that burden by identifying and discussing key features of the CPRA and how those features impact organizations’ existing CCPA compliance programs.

Notice At Collection

One important step subject organizations will need to take in response to the CPRA is to update their CCPA notices at collection.  Under the CCPA, an organization is required to provide to consumers – a category which includes employees, applicants, and contractors – a notice that discloses the categories of personal information the organization collects and the purposes for which it uses that information.

When the CPRA takes effect in January 2023, organizations will be required to augment their notices to include three additional categories of disclosure.  Specifically, they will need to:

  1. disclose whether they sell or share personal information;
  2. make disclosures related to their collection, processing, and disclosure of “sensitive personal information,” a new category of information created by the CPRA, which we further discuss below; and
  3. disclose the length of time they intend to retain each category of personal information, or, if that would not be feasible, the criteria they will use to determine that retention period.

Privacy Policy

The passage of the CPRA will also require subject organizations to revisit their privacy policies.  The CCPA requires organizations to develop and post online a privacy policy that informs consumers about the existence of, and provides guidance on how to exercise, their CCPA rights: for instance, their right to know what personal information organizations collect, disclose, or sell about them; their right to request the deletion of that information; and their right to opt out of its sale.

The CPRA modifies certain of the rights provided for in the CCPA, while also adding several that are novel.  Specifically, the CPRA:

  • enlarges the CCPA’s 12-month look-back period for requests to “know” (while affording organizations an opportunity to deny expanded requests if compliance would be “impossible” or “involve a disproportionate effort”);
  • adds to the CCPA-established right to opt-out of the sale of personal information a new right to opt-out of the sharing of that information;
  • requires organizations, in the event they receive a deletion request, to direct any service providers, third parties, and/or “contractors” (a new category created by the CPRA) to whom they sold the personal information at issue, or with whom they shared it, to delete that information;
  • creates a new category of personal information – “sensitive personal information” – and empowers consumers to direct organizations to limit their use of such information; and
  • grants consumers the new right to request that organizations correct inaccuracies in their personal information.

Prior to the effective date of the CPRA, organizations will need to update their notices at collection and privacy policies to address the new and modified rights it grants consumers.  For assistance with these updates, please reach out to a member of the Privacy, Data and Cybersecurity group or the Jackson Lewis attorney with whom you regularly work.