In June 2022, the California Privacy Protection Agency (CPPA) Board began discussing revisions to the regulations previously released by the California Attorney General.

In October, the Board released proposed modifications to the regulations in advance of a planned Board meeting. Since then, the Board has rescheduled both Board and public meetings.

The Board appears to be nearing a final vote on the regulations but recently published further modifications to them. These modifications open a new public comment period that ends November 21.

Recent updates to the regulations include:

  • Sections clarifying how consumers can opt out of having their data sold or shared, including via opt-out preference signals.
  • Provisions providing allowances for enforcement flexibility, which are intended to assuage businesses’ concerns that the current delay in adopting final regulations will present compliance challenges.
  • Allowances for businesses, service providers, and contractors to delay compliance with requests to correct archived or backup systems until the data is restored to an active system or is next accessed or used.

Businesses can submit comments regarding the current version of the regulations by:

  • E-mail to: regulations@cppa.ca.gov. Submissions should include “CPPA Public Comment” in the subject line and provide comments in an attachment.
  • Mail to: California Privacy Protection Agency
    Attn: Brian Soublet
    2101 Arena Blvd., Sacramento, CA 95834

Jackson Lewis will continue to track information related to privacy regulations and related issues. For additional information on the CPRA, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

Responding in part to the nature of the post-COVID-19 remote workplace, NLRB GC Jennifer Abruzzo has released a memo on employers’ use of electronic monitoring and automated management in the workplace. The memo also directs NLRB Regions to submit to the Division of Advice any cases involving intrusive or abusive electronic surveillance and algorithmic management that interferes with the exercise of NLRA Section 7 rights.

Read the full article on Jackson Lewis’ Labor & Collective Bargaining.

We have been quite busy this October, which happens to be National Cybersecurity Awareness Month. But we did not want to let the month go by without some recognition, and we are grateful to the HHS Office for Civil Rights (OCR) for this always timely reminder for HIPAA covered entities and business associates: have a written incident response plan.

Why do we need another policy?

First, because it is required under the HIPAA Security Rule. See 45 CFR 164.308(a)(6). Also, because cybersecurity risks continue to rise. The OCR notes that cybersecurity incidents and data breaches continue to increase in the healthcare sector, citing a 69% increase in cyber-attacks for the first half of 2022 compared to 2021. Breaches of unsecured protected health information (PHI), including electronic PHI, reported to OCR affecting 500 or more individuals increased from 663 in 2020 to 714 in 2021.

Fine, so what does an incident response plan need to include?

The OCR describes some basic elements that should be included in an incident response plan (IRP):

  • identifying security incidents;
  • responding to security incidents;
  • mitigating harmful effects of security incidents; and
  • documenting security incidents and their outcomes.

As we get more specific below, note that each covered entity and business associate is different in several respects, such as size, number of locations, information systems, prior experience, cyber insurance policies, type of PHI, and state laws, just to name a few. So, your specific IRP may vary in significant ways, but these are four critical elements to address for your particular business and practice.

Can you be more specific?

Sure. The organization will want to think about who will be doing the responding – who is on the “security incident response team.” This is a team that is organized and trained to effectively respond to security incidents. OCR offers several areas to consider when forming a team, such as:

  • Have a strong balance of skill sets among team members (IT, legal, communications, etc.)
  • Ensure lines of communication will be available among team members during a crisis
  • Consider external parties that can provide specific expertise concerning incident response
  • Commit to regularly practicing incident response procedures for different types of attacks

With a team established, the plan should provide for identifying security incidents. Of course, this requires knowing that a security incident is “the attempted or successful unauthorized access, use, disclosure, modification, or destruction of information or interference with system operations in an information system.” One way to identify security incidents is having audit logs in place and regularly reviewing them.

In the event of a security incident, the plan needs to cover the steps for responding. This includes containing the security incident and any threat it may pose to ePHI, such as by identifying and removing any malicious code and mitigating any vulnerabilities that may have permitted the security incident to occur. However, to be better prepared to respond to security incidents, the plan should also include procedures such as:

  • Processes to identify and determine the scope of security incidents
  • Instructions for managing the security incident
  • Creating and maintaining a list of assets (computer systems and data) to prioritize when responding to a security incident
  • Conducting a forensic analysis to identify the extent and magnitude of the security incident
  • Reporting the security incident to appropriate internal and external entities
  • Processes for collecting and maintaining evidence of the security incident (e.g., log files, registry keys, and other artifacts) to determine what was accessed during the security incident

After the security incident has been neutralized, the next steps should include mitigation, including recovery and restoration of systems and data to return to normal operations. Mitigation efforts are facilitated through contingency planning, robust data backup, and recovery processes. These are areas that should not be considered for the first time when a security incident occurs. For example, knowing that you have a backup is not enough; regularly confirming that you can restore from backups while maintaining integrity is key.

When these steps have been completed, particularly after operations have returned to normal, regulated entities must document their response to the security incident. This is required under HIPAA. The IRP can be helpful in outlining what information to include in the documentation (e.g., discovery of the security incident; systems and data affected; response and mitigation activities; recovery outcomes; root cause analysis; forensic data collected).

What about notification? Shouldn’t that be part of the IRP?

Of course. The IRP should address the entity’s reporting obligations, whether to the affected individuals, the OCR, the media, state agencies, or a covered entity (for business associates). A critical aspect of notification is timing. For breaches affecting 500 or more individuals, notice is required without unreasonable delay and no later than 60 calendar days from the discovery of the breach. The OCR reminds regulated entities:

the time period [for reporting] begins when the incident is first known, not when the investigation of the incident is complete, even if it is initially unclear whether the incident constitutes a breach as defined in the rule. 

Further, 60 days is the outer limit for notification but,

in some cases, it may be an ‘unreasonable delay’ to wait until the 60th day to provide notification.
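The outer deadline described above is simple calendar arithmetic. As a minimal sketch (the function name is ours, and this illustrates only the 60-calendar-day outer limit, not the circumstances in which earlier notice is required):

```python
from datetime import date, timedelta

def hipaa_notice_deadline(discovery: date) -> date:
    """Outer limit for HIPAA breach notification: 60 calendar days
    from discovery of the breach. Notice may be required sooner;
    waiting until day 60 can itself be an unreasonable delay."""
    return discovery + timedelta(days=60)

# A breach discovered October 1, 2022 has an outer notification
# deadline of November 30, 2022.
print(hipaa_notice_deadline(date(2022, 10, 1)))  # 2022-11-30
```

Note the clock starts when the incident is first known, not when the investigation concludes, so the `discovery` date should be the earliest date of knowledge.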

There is a lot more that can be said about IRPs, and it is not a good idea to wait until the next National Cybersecurity Awareness Month to craft one. Also, while directed to healthcare providers and their business associates, the same kind of planning is prudent for just about all organizations. 

Over the past several years, there has been a significant increase in the use of dashcam technology. The technology available in the market is quite advanced. As we observed here, these devices can be equipped with geolocation, AI, facial recognition, and other technologies. Although designed primarily to enhance driver safety and fleet management, these devices raise privacy concerns that are tapping the brakes on implementation in California.

On September 29, 2022, Governor Gavin Newsom signed AB-984 into law, effective January 1, 2023. The law builds on other privacy protections in California, such as the California Consumer Privacy Act and Penal Code Sec. 637.7. Section 637.7 prohibits using an electronic tracking device to determine the location or movement of a person; however, it does not apply when the vehicle owner (e.g., the employer) has consented to the use of the device.

Among other notable provisions, including the latest in vehicle tech, digital license plates, AB-984 places significant restrictions on the use of an “alternative device” to monitor employees. Specifically, the law provides:

An employer, or a person acting on behalf of the employer, shall not use an alternative device to monitor employees except during work hours, and only if strictly necessary for the performance of the employee’s duties.

The statute defines monitoring to include, without limitation, “locating, tracking, watching, listening to, or otherwise surveilling the employee.” However, there is no definition of “strictly necessary,” making the statute more difficult to navigate.

Employers that choose to install such a device must provide notice to employees prior to monitoring with the device. That notice must, at a minimum, include the following:

(A) A description of the specific activities that will be monitored.

(B) A description of the worker data that will be collected as a part of the monitoring.

(C) A notification of whether the data gathered through monitoring will be used to make or inform any employment-related decisions, including, but not limited to, disciplinary and termination decisions, and, if so, how, including any associated benchmarks.

(D) A description of the vendors or other third parties, if any, to which information collected through monitoring will be disclosed or transferred. The description shall include the name of the vendor or third party and the purpose for the data transfer.

(E) A description of the organizational positions that are authorized to access the data gathered through the alternative device.

(F) A description of the dates, times, and frequency that the monitoring will occur.

(G) A description of where the data will be stored and the length of time it will be retained.

(H) A notification of the employee’s right to disable monitoring, including vehicle location technology, outside of work hours.

Employers that fail to comply can be subject to significant penalties. A civil penalty of $250 can be imposed for an initial violation, while a $1,000 penalty per employee can be imposed for each subsequent violation. The statute expressly provides that penalties “shall be assessed per employee, per violation, and per day that monitoring without proper notice is conducted.”
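To illustrate how the per-employee, per-day assessment can compound, consider this hypothetical back-of-the-envelope sketch. It is not legal advice, and how the $250 initial-violation penalty interacts with the per-employee assessment is our assumption for illustration only:

```python
def estimate_penalty_exposure(employees: int, days: int) -> int:
    """Hypothetical estimate of AB-984 notice-violation exposure.

    Assumes (for illustration) a $250 initial violation per employee
    on the first day of non-compliant monitoring, and $1,000 per
    employee for each additional day, reflecting the statute's
    per-employee, per-violation, per-day assessment.
    """
    INITIAL_PENALTY = 250
    SUBSEQUENT_PENALTY = 1_000
    if employees <= 0 or days <= 0:
        return 0
    # Day 1: initial violation; each additional day: subsequent violation.
    return employees * (INITIAL_PENALTY + SUBSEQUENT_PENALTY * (days - 1))

# e.g., 50 employees monitored without proper notice for 10 days:
# 50 * (250 + 1,000 * 9) = $462,500
```

Even under conservative assumptions, exposure scales quickly with headcount and duration, which is why advance notice matters.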

In addition to penalties, employers have additional exposure if found to have retaliated against an employee for removing or disabling an alternative device’s monitoring capabilities outside of work hours. In this case, the employee “shall be entitled to all available penalties, remedies, and compensation, including, but not limited to, reinstatement and reimbursement of lost wages, work benefits, or other compensation caused by the retaliation.”

For employers considering using an alternative device to monitor employees in vehicles, there are at least two steps to take:

  • Assess whether doing so is “strictly necessary” for the performance of the employee’s duties
  • Provide advance notice of the monitoring

There are several other issues to consider as well, as is apparent just from the items required to be included in the notice.

On October 21 and 22, the California Privacy Protection Agency (CPPA) Board will meet to discuss possible action regarding the proposed regulations for the California Consumer Privacy Act (CCPA) and California Privacy Rights Act (CPRA).

Previously, in June 2022, the Board met to discuss revising the regulations previously released by the California Attorney General.  

In advance of the October CPPA Board meeting, further proposed modifications to the regulations have been published, along with an explanation of the proposed changes.

Some of the more significant changes include:

  • Revised Section 7002 regarding the “Restrictions on the Collection and Use of Personal Information” to clarify specific requirements. The revision sets forth factors to be considered in evaluating the collection and use including: (1) the reasonable expectations of a consumer concerning the purpose for which personal information is collected or processed, (2) the purposes that are compatible with the context in which the personal information is collected, and (3) whether collecting or processing personal information is reasonably necessary and proportionate to achieve those purposes.
  • Revised Section 7004 regarding the “Requirements for Methods for Submitting CCPA Requests and Obtaining Consumer Consent” to explain how different user interfaces can impair or interfere with consumers’ choice and can fail to meet the definition of consent under the Civil Code.
  • Also revised Section 7004 (a)(2) to clarify that the symmetry in choice principle also considers whether different paths are more difficult or time-consuming.
  • Revised Section 7052 regarding “Third Parties” to clarify that third parties are contractually required to treat the personal information that businesses make available to them in the same manner the business is required to treat it under the CCPA.

It is possible that at the October meeting, the CPPA could elect to adopt the modified regulations or choose to make further changes.

Jackson Lewis will continue to track information related to privacy regulations and related issues. For additional information on the CPRA, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

In July 2020, the Court of Justice of the European Union (CJEU) declared the EU-U.S. Privacy Shield invalid. The EU-U.S. Privacy Shield program was designed to provide European Economic Area (EEA) data transferred to the U.S. with a level of protection comparable to EU law. The CJEU invalidated the program, stating that U.S. companies could not provide an essentially equivalent level of protection based on the breadth of U.S. national security surveillance laws, FISA 702, E.O. 12333, and PPD 28. In the wake of the decision, businesses relying on the EU-U.S. Privacy Shield as an adequate transfer mechanism to perform routine activities such as sending employee data from the EEA to U.S. headquarters for HR administration, accessing a global HR database from the U.S., remotely accessing EEA user accounts from the U.S. for IT services, providing EEA data to third party vendors for processing in the U.S., or relying on certain cloud-based services were forced to rely on alternate mechanisms, including standard contractual clauses.

On October 7, 2022, President Biden signed an Executive Order that outlines steps the U.S. government will take to implement a new EU-U.S. data privacy framework, the Trans-Atlantic Data Privacy Framework, to replace the invalidated EU-U.S. Privacy Shield.

The new Framework is designed to restore a legal basis for transatlantic data flows and addresses concerns raised in the CJEU decision by strengthening privacy and civil liberties protections for foreign individuals and creating an independent and binding process for non-U.S. citizens to seek redress if they believe their personal data was improperly collected through U.S. signals intelligence. Signals intelligence activities involve collecting foreign intelligence from communications and information systems.

The Executive Order is the first step toward rebuilding the EU-U.S. data protection program. Over the next few months, the EU Commission will review the framework and, if satisfied with the proposed safeguards and protections for EU data and individuals, issue an “adequacy decision” concluding that data transferred to the U.S. will receive an essentially equivalent level of protection. While legal challenges to this new framework are anticipated, the Executive Order demonstrates a U.S. commitment to addressing EU concerns regarding data protection. It also provides an incentive for U.S. organizations to maintain their EU-U.S. Privacy Shield certification in hopes it can be leveraged under the new framework.

If you have questions about the effect of the Executive Order on your business or related issues contact the Jackson Lewis attorney with whom you regularly work or a member of our Privacy, Data, and Cybersecurity practice group.

On October 3, 2022, the White House Office of Science and Technology Policy published its “Blueprint for an AI Bill of Rights.” This adds to prior federal guidance released by the EEOC and DOJ regarding the use of AI in employment decisions.

The framework published by the White House is intended to apply to automated systems that have an impact on individuals’ “rights, opportunities, or access to critical resources or services.”

The blueprint sets forth five protections to which individuals should be entitled:

  • Safe and effective systems
  • Protection from algorithmic discrimination
  • Data Privacy
  • Notice and explanation when an automated system is being used and how it impacts the individual
  • Ability to opt out of automated systems and have access to people who can remedy issues

The framework is intended to assist in putting guardrails in place in the use of AI and automated systems. In conjunction with the publishing of the blueprint, the Biden-Harris Administration announced actions across the federal government to advance protections for workers and employers, students, patients, and more.

These initiatives include the Department of Labor’s release of “What the Blueprint for AI Bill of Rights Means for Workers” and its ramped-up enforcement of required surveillance reporting to protect worker organizing. There are also consumer protections, such as the Federal Trade Commission’s recent consideration of rulemaking on consumer privacy and data, and many other initiatives related to education and health care.

The Administration’s announcement is consistent with steps taken during the Trump Administration. It also is generally consistent with principles for AI established by the Organization for Economic Cooperation and Development (OECD). The OECD is a global organization established in 1961 to promote economic cooperation and development with nearly 40 members, including the United States. In 2019, the U.S. National Telecommunications and Information Administration joined the OECD in adopting global AI principles. Among other things, the OECD’s Principles on Artificial Intelligence provide that AI actors should:

“respect the rule of law, human rights, and democratic values, throughout the AI system lifecycle. These include freedom, dignity, and autonomy, privacy and data protection, nondiscrimination and equality, diversity, fairness, social justice, and internationally recognized labour rights.”

“provide meaningful information… (i) to foster a general understanding of AI systems, (ii) to make stakeholders aware of their interactions with AI systems, including in the workplace, (iii) to enable those affected…to understand the outcome, and (iv) to enable those adversely affected… to challenge its outcome”

While this latest blueprint for use of AI is only guidance at this time, it signals the direction the federal government intends to take with future regulation and legislation when it comes to automated systems and related technology. And, it builds on a set of principles emerging globally that seek to ensure the appropriate use of AI, principles that we are seeing embedded in laws in the U.S. such as the law regulating “automated employment decision tools” going into effect in New York City in 2023.

Businesses and employers who use AI and automated systems need to consider the Administration’s guidance along with emerging laws, regulations, and principles to guide their adoption and application of AI. This includes developing policies and procedures that establish protections to avoid potential discrimination or breaches of privacy.

If you have questions about developing policies and procedures around the use of AI and automated systems contact the Jackson Lewis attorney with whom you regularly work or a member of our Privacy, Data, and Cybersecurity practice group.

California passed Assembly Bill (AB) 2089, which amends the Confidentiality of Medical Information Act (CMIA) to include mental health application information under the definition of medical information. Under the revisions to CMIA, mental health application information is defined as information related to a consumer’s inferred or diagnosed mental health or substance use disorder, as defined in Section 1374.72 of the Health and Safety Code, collected by a mental health digital service.  Similarly, “mental health digital service” is defined as a mobile-based application or internet website that collects mental health application information from a consumer, markets itself as facilitating mental health services to a consumer, and uses the information to facilitate mental health services to a consumer.

Under AB 2089 any business that offers a mental health digital service to a consumer for the purpose of allowing the individual to manage the individual’s information, or for the diagnosis, treatment, or management of a medical condition of the individual, is deemed to be a provider of health care subject to the requirements of the CMIA.

Moreover, the bill requires any business that offers a mental health digital service, when partnering with a provider of health care, to provide to the health care provider information regarding how to find data breaches reported, as specified, on the internet website of the Attorney General.

If you have questions about AB 2089 or related issues, contact the Jackson Lewis attorney with whom you regularly work or reach out to a member of our Privacy, Data, and Cybersecurity practice group.

1. What’s changing?

Under the current version of the California Consumer Privacy Act (“CCPA”), an employer’s obligations related to the personal information it collects from employees, applicants, and contractors residing in California (collectively, “Employment Information”) are relatively limited.  Specifically, it needs to (1) provide those individuals a “notice at collection” that discloses the categories of personal information the employer collects about them and the purposes for which that information is used, and (2) safeguard those individuals’ personal information against unauthorized access or acquisition.

Come January 1, 2023, however, those obligations will dramatically expand when California’s new comprehensive privacy law, the California Privacy Rights Act (“CPRA”), which amends the CCPA, takes effect. 

2. How will Employment Information be treated after January 1, 2023?

Subject to any regulatory updates, Employment Information will be treated like commercial consumer information.

3. What are we required to do by January 1, 2023?

With respect to Employment Information, the core requirements of the CCPA will be as follows:

  • At or before the collection of Employment Information, provide employees, applicants, and contractors a notice at collection, disclosing the categories of Employment Information you collect, the purposes for which that information is used, and certain record retention information.
  • Provide employees, applicants, and contractors a privacy policy that discloses, in addition to the notice at collection information, the sources from which you collect Employment Information; the parties to which, and purposes for which, you disclose that information; and the rights granted to employees, applicants, and contractors by the CCPA (e.g., the right to access, correct, and/or delete personal information).
  • Develop policies, procedures, and forms to process requests to access, correct, and/or delete personal information, and to avoid discriminating against individuals for exercising those rights.  This includes verifying the identities and authority of the persons making the requests, including third parties acting on their behalf.  
    • Train applicable staff on processing the above requests.
  • Determine whether you must extend the right to limit the use and disclosure of sensitive Employment Information. This will depend on your uses and disclosures of “sensitive personal information”, which is a narrow subset of personal information.
  • Identify service providers and contractors with access to Employment Information and ensure your contracts with those parties are CCPA-compliant.
  • While not a per se requirement, conducting a data mapping exercise is often critical to compliance with the obligations listed above.  Specifically, data mapping will help you identify, inter alia: what personal information you collect about employees, applicants, and contractors; the purposes for which you use that information; the sources of that information; the parties to which that information is disclosed, and for what purposes; and how long that information is retained.

4. What about the personal information of spouses and dependents?

Subject to any regulatory updates or clarifications, if the spouse or dependent is a California resident, their personal information would be subject to the same protections as Employment Information.

5. I keep seeing more “Do Not Sell My Personal Information” links on websites.  Does that requirement apply here?

We expect most employers will not be “selling” or “sharing” Employment Information, as those terms are defined under the CCPA.  However, it is prudent to analyze those definitions – in particular, for selling – to be sure.

6. January 1, 2023, is really soon.  We don’t have time for all of that.  Where should we focus our attention?!?

Full compliance with the CCPA will be a heavy lift for employers.  Those looking to triage in advance of the effective date can prioritize these relatively manageable action items:

  • Develop a working draft of your privacy policy (which would include an updated notice at collection)
  • Ensure your service provider and contractor agreements are compliant
  • Implement a preliminary framework for processing requests to access, correct, and/or delete personal information
  • Start the data mapping process

7. Is there a chance the California legislature could change this?

The California legislature reconvenes in January 2023 and, yes, it is possible it could pass a law that would revert to the rules for Employment Information described in Question 1 above or eliminate the CCPA’s application to Employment Information entirely.  By that point, however, the changes described above will already be in effect (although there is an enforcement grace period through July 1, 2023).  Waiting and hoping the California legislature jumps in to save employers is a risky strategy. 

If you have questions about compliance requirements under CCPA/CPRA please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

California’s Governor signed Assembly Bill (AB) 2273, the first of its kind state legislation that requires businesses that provide online services, products, or features likely to be accessed by children to comply with specified standards.

Building on federal protections for children online under the Children’s Online Privacy Protection Act (COPPA), AB 2273 enacts the California Age-Appropriate Design Code Act, which, starting July 1, 2024, will require a business that provides an online service, product, or feature likely to be accessed by children to comply with a significant number of specified requirements. For example, under the Act, such businesses must:

  • Configure all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children;
  • Provide privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature;
  • Provide an obvious signal to the child when the child is being monitored or tracked when the online service, product, or feature allows the child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location;
  • Provide prominent, accessible, and responsive tools to help children, or if applicable their parents or guardians, exercise their privacy rights and report concerns;
  • Not (i) use the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child; (ii) profile a child by default unless certain criteria are satisfied; or (iii) collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, or pursuant to certain legal requirements, unless the business can demonstrate a compelling reason that the collecting, selling, sharing, or retaining of the personal information is in the best interests of children likely to access the online service, product, or feature.

AB 2273 requires a business, before any new online services, products, or features are offered to the public, to complete a Data Protection Impact Assessment for any online service, product, or feature likely to be accessed by children and maintain documentation of this assessment as long as the online service, product, or feature is likely to be accessed by children. The Impact Assessment must address several aspects of the online service, product, or feature, such as:

  • Whether its design could harm children, including by exposing children to harmful, or potentially harmful, content on the online product, service, or feature.
  • Whether its design could lead to children experiencing or being targeted by harmful, or potentially harmful, contacts.
  • Whether algorithms used could harm children.
  • Whether the targeted advertising systems used could harm children.

Moreover, a business would need to make a Data Protection Impact Assessment available to the Attorney General within 5 business days of a written request. The bill also exempts a Data Protection Impact Assessment from public disclosure.

AB 2273 also authorizes the Attorney General to seek an injunction or civil penalty against any business that violates its provisions. The bill would hold violators liable for a civil penalty of not more than $2,500 per affected child for each negligent violation or not more than $7,500 per affected child for each intentional violation.

If you have questions about AB 2273 or related issues, contact the Jackson Lewis attorney with whom you regularly work or reach out to a member of our Privacy, Data, and Cybersecurity practice group.