In July 2020, the Court of Justice of the European Union (CJEU) declared the EU-U.S. Privacy Shield invalid. The program was designed to provide European Economic Area (EEA) data transferred to the U.S. with a level of protection comparable to EU law. The CJEU invalidated the program on the ground that U.S. companies could not provide an essentially equivalent level of protection given the breadth of U.S. national security surveillance authorities: Section 702 of the Foreign Intelligence Surveillance Act (FISA), Executive Order 12333, and Presidential Policy Directive 28 (PPD-28). In the wake of the decision, businesses that had relied on the EU-U.S. Privacy Shield as an adequate transfer mechanism for routine activities (sending employee data from the EEA to U.S. headquarters for HR administration, accessing a global HR database from the U.S., remotely accessing EEA user accounts from the U.S. for IT services, providing EEA data to third-party vendors for processing in the U.S., or relying on certain cloud-based services) were forced to rely on alternate mechanisms, including standard contractual clauses.

On October 7, 2022, President Biden signed an Executive Order that outlines the steps the U.S. government will take to implement a new EU-U.S. data privacy framework, the Trans-Atlantic Data Privacy Framework, to replace the invalidated EU-U.S. Privacy Shield.

The new Framework is designed to restore a legal basis for transatlantic data flows and addresses concerns raised in the CJEU decision by strengthening privacy and civil liberties protections for foreign individuals and by creating an independent, binding process through which non-U.S. citizens can seek redress if they believe their personal data was improperly collected through U.S. signals intelligence. Signals intelligence activities involve collecting foreign intelligence from communications and information systems.

The Executive Order is the first step toward rebuilding the EU-U.S. data protection program. Over the coming months, the European Commission will review the framework and, if satisfied with the proposed safeguards and protections for EU data and individuals, issue an “adequacy decision” concluding that data transferred to the U.S. will receive an essentially equivalent level of protection. While legal challenges to the new framework are anticipated, the Executive Order demonstrates a U.S. commitment to addressing EU concerns regarding data protection. It also gives U.S. organizations an incentive to maintain their EU-U.S. Privacy Shield certifications in hopes they can be leveraged under the new framework.

If you have questions about the effect of the Executive Order on your business or related issues, contact the Jackson Lewis attorney with whom you regularly work or a member of our Privacy, Data, and Cybersecurity practice group.

On October 3, 2022, the White House Office of Science and Technology Policy published its “Blueprint for an AI Bill of Rights.” The blueprint adds to prior federal guidance on the use of AI in employment decisions released by the Equal Employment Opportunity Commission (EEOC) and the Department of Justice (DOJ).

The framework published by the White House is intended to apply to automated systems that have an impact on individuals’ “rights, opportunities, or access to critical resources or services.”

The blueprint sets forth five protections to which individuals should be entitled:

  • Safe and effective systems
  • Protection from algorithmic discrimination
  • Data privacy
  • Notice and explanation when an automated system is being used and how it impacts the individual
  • Ability to opt out of automated systems and have access to people who can remedy issues

The framework is intended to help put guardrails around the use of AI and automated systems. In conjunction with publishing the blueprint, the Biden-Harris Administration announced actions across the federal government to advance protections for workers and employers, students, patients, and more.

These initiatives include the Department of Labor’s release of “What the Blueprint for AI Bill of Rights Means for Workers” and its stepped-up enforcement of required surveillance reporting to protect worker organizing. The announcement also notes consumer protections, such as the Federal Trade Commission’s recent consideration of rulemaking on consumer privacy and data, and many other initiatives related to education and health care.

The Administration’s announcement is consistent with steps taken during the Trump Administration. It also is generally consistent with the AI principles established by the Organization for Economic Cooperation and Development (OECD), a global organization founded in 1961 to promote economic cooperation and development, with nearly 40 members including the United States. In 2019, the U.S. National Telecommunications and Information Administration joined the OECD in adopting global AI principles. Among other things, the OECD’s Principles on Artificial Intelligence provide that AI actors should:

“respect the rule of law, human rights, and democratic values, throughout the AI system lifecycle. These include freedom, dignity, and autonomy, privacy and data protection, nondiscrimination and equality, diversity, fairness, social justice, and internationally recognized labour rights.”

“provide meaningful information… (i) to foster a general understanding of AI systems, (ii) to make stakeholders aware of their interactions with AI systems, including in the workplace, (iii) to enable those affected…to understand the outcome, and (iv) to enable those adversely affected… to challenge its outcome”

While this latest blueprint for the use of AI is only guidance at this time, it signals the direction the federal government intends to take with future regulation and legislation concerning automated systems and related technology. It also builds on a set of principles emerging globally that seek to ensure the appropriate use of AI, principles we are already seeing embedded in U.S. laws, such as the New York City law regulating “automated employment decision tools” that takes effect in 2023.

Businesses and employers that use AI and automated systems should weigh the Administration’s guidance along with emerging laws, regulations, and principles when adopting and applying AI. This includes developing policies and procedures that establish protections against potential discrimination and breaches of privacy.

If you have questions about developing policies and procedures around the use of AI and automated systems, contact the Jackson Lewis attorney with whom you regularly work or a member of our Privacy, Data, and Cybersecurity practice group.

California passed Assembly Bill (AB) 2089, which amends the Confidentiality of Medical Information Act (CMIA) to bring mental health application information within the definition of medical information. Under the revised CMIA, mental health application information is information related to a consumer’s inferred or diagnosed mental health or substance use disorder, as defined in Section 1374.72 of the Health and Safety Code, collected by a mental health digital service. In turn, a “mental health digital service” is a mobile-based application or internet website that collects mental health application information from a consumer, markets itself as facilitating mental health services to a consumer, and uses the information to facilitate mental health services to a consumer.

Under AB 2089, any business that offers a mental health digital service to a consumer for the purpose of allowing the individual to manage the individual’s information, or for the diagnosis, treatment, or management of a medical condition of the individual, is deemed a provider of health care subject to the requirements of the CMIA.

Moreover, the bill requires any business that offers a mental health digital service, when partnering with a provider of health care, to give the provider information on how to find data breaches reported, as specified, on the Attorney General’s website.

If you have questions about AB 2089 or related issues, contact the Jackson Lewis attorney with whom you regularly work or reach out to a member of our Privacy, Data, and Cybersecurity practice group.

1. What’s changing?

Under the current version of the California Consumer Privacy Act (“CCPA”), an employer’s obligations related to the personal information it collects from employees, applicants, and contractors residing in California (collectively, “Employment Information”) are relatively limited.  Specifically, it needs to (1) provide those individuals a “notice at collection” that discloses the categories of personal information the employer collects about them and the purposes for which that information is used, and (2) safeguard those individuals’ personal information against unauthorized access or acquisition.

Come January 1, 2023, however, those obligations will dramatically expand when California’s new comprehensive privacy law, the California Privacy Rights Act (“CPRA”), which amends the CCPA, takes effect. 

2. How will Employment Information be treated after January 1, 2023?

Subject to any regulatory updates, Employment Information will be treated like commercial consumer information.

3. What are we required to do by January 1, 2023?

With respect to Employment Information, the core requirements of the CCPA will be as follows:

  • At or before the collection of Employment Information, provide employees, applicants, and contractors a notice at collection, disclosing the categories of Employment Information you collect, the purposes for which that information is used, and certain record retention information.
  • Provide employees, applicants, and contractors a privacy policy that discloses, in addition to the notice-at-collection information, the sources from which you collect Employment Information; the parties to which, and purposes for which, you disclose that information; and the rights granted to employees, applicants, and contractors by the CCPA (e.g., the right to access, correct, and/or delete personal information).
  • Develop policies, procedures, and forms to process requests to access, correct, and/or delete personal information, and to avoid discriminating against individuals for exercising those rights.  This includes verifying the identities and authority of the persons making the requests, including third parties acting on their behalf.  
    • Train applicable staff on processing the above requests.
  • Determine whether you must extend the right to limit the use and disclosure of sensitive Employment Information. This will depend on your uses and disclosures of “sensitive personal information”, which is a narrow subset of personal information.
  • Identify service providers and contractors with access to Employment Information and ensure your contracts with those parties are CCPA-compliant.
  • While not a per se requirement, conducting a data mapping exercise is often critical to compliance with the obligations listed above. Specifically, data mapping will help you identify, inter alia: what personal information you collect about employees, applicants, and contractors; the purposes for which you use that information; the sources of that information; the parties to which that information is disclosed, and for what purposes; and how long that information is retained. (A simple sketch of what one data-map entry might capture follows this list.)
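
As a simple illustration of the kind of record a data mapping exercise produces, below is a hypothetical sketch in TypeScript. The field names and example values are our own assumptions for illustration only; the CCPA does not prescribe any particular format for a data map.

```typescript
// Hypothetical shape for one row of a CCPA data-mapping inventory.
// Field names are illustrative; the statute prescribes no format.
interface DataMapEntry {
  category: string;        // e.g., "identifiers", "professional information"
  examples: string[];      // specific data elements within the category
  purposes: string[];      // why the information is collected and used
  sources: string[];       // where the information comes from
  disclosedTo: string[];   // service providers, contractors, third parties
  retentionPeriod: string; // how long it is kept, per your retention schedule
  sensitive: boolean;      // "sensitive personal information" under the CPRA?
}

// Example entry for basic employee identifiers (hypothetical values).
const example: DataMapEntry = {
  category: "identifiers",
  examples: ["name", "home address", "work email"],
  purposes: ["payroll administration", "benefits enrollment"],
  sources: ["HR onboarding forms", "employee self-service portal"],
  disclosedTo: ["payroll service provider"],
  retentionPeriod: "duration of employment plus 4 years",
  sensitive: false,
};
```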

4. What about the personal information of spouses and dependents?

Subject to any regulatory updates or clarifications, if the spouse or dependent is a California resident, their personal information would be subject to the same protections as Employment Information.

5. I keep seeing more “Do Not Sell My Personal Information” links on websites.  Does that requirement apply here?

We expect most employers will not be “selling” or “sharing” Employment Information, as those terms are defined under the CCPA.  However, it is prudent to analyze those definitions – in particular, for selling – to be sure.

6. January 1, 2023, is really soon.  We don’t have time for all of that.  Where should we focus our attention?!?

Full compliance with the CCPA will be a heavy lift for employers.  Those looking to triage in advance of the effective date can prioritize these relatively manageable action items:

  • Develop a working draft of your privacy policy (which would include an updated notice at collection)
  • Ensure your service provider and contractor agreements are compliant
  • Implement a preliminary framework for processing requests to access, correct, and/or delete personal information
  • Start the data mapping process

7. Is there a chance the California legislature could change this?

The California legislature reconvenes in January 2023 and, yes, it is possible it could pass a law that would revert to the rules for Employment Information described in Question 1 above or eliminate the CCPA’s application to Employment Information entirely.  By that point, however, the changes described above will already be in effect (although there is an enforcement grace period through July 1, 2023).  Waiting and hoping the California legislature jumps in to save employers is a risky strategy. 

If you have questions about compliance requirements under CCPA/CPRA please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

California’s Governor signed Assembly Bill (AB) 2273, first-of-its-kind state legislation that requires businesses providing online services, products, or features likely to be accessed by children to comply with specified standards.

Building on federal protections for children online under the Children’s Online Privacy Protection Act (COPPA), AB 2273 enacts the California Age-Appropriate Design Code Act, which, starting on July 1, 2024, will require a business that provides an online service, product, or feature likely to be accessed by children to comply with a significant number of specified requirements. For example, under the Act, such businesses must:

  • Configure all default privacy settings offered by the online service, product, or feature to the settings that offer a high level of privacy, unless the business can demonstrate a compelling reason that a different setting is in the best interests of children;
  • Provide privacy information, terms of service, policies, and community standards concisely, prominently, and using clear language suited to the age of children likely to access that online service, product, or feature;
  • Provide an obvious signal to the child when the child is being monitored or tracked when the online service, product, or feature allows the child’s parent, guardian, or any other consumer to monitor the child’s online activity or track the child’s location;
  • Provide prominent, accessible, and responsive tools to help children, or if applicable their parents or guardians, exercise their privacy rights and report concerns;
  • Not (i) use the personal information of any child in a way that the business knows, or has reason to know, is materially detrimental to the physical health, mental health, or well-being of a child; (ii) profile a child by default unless certain criteria are satisfied; or (iii) collect, sell, share, or retain any personal information that is not necessary to provide an online service, product, or feature with which a child is actively and knowingly engaged, or to comply with certain legal requirements, unless the business can demonstrate a compelling reason that the collecting, selling, sharing, or retaining of the personal information is in the best interests of children likely to access the online service, product, or feature.

AB 2273 requires a business, before offering any new online service, product, or feature to the public, to complete a Data Protection Impact Assessment for any online service, product, or feature likely to be accessed by children and to maintain documentation of the assessment as long as the online service, product, or feature is likely to be accessed by children. The Impact Assessment must address several aspects of the online service, product, or feature, such as:

  • Whether its design could harm children, including by exposing children to harmful, or potentially harmful, content on the online product, service, or feature.
  • Whether its design could lead to children experiencing or being targeted by harmful, or potentially harmful, contacts.
  • Whether algorithms used could harm children.
  • Whether the targeted advertising systems used could harm children.

Moreover, a business would need to make a Data Protection Impact Assessment available to the Attorney General within five business days of a written request. The bill also exempts a Data Protection Impact Assessment from public disclosure.

AB 2273 also authorizes the Attorney General to seek an injunction or civil penalty against any business that violates its provisions. Violators are liable for a civil penalty of not more than $2,500 per affected child for each negligent violation or not more than $7,500 per affected child for each intentional violation. Because penalties scale per child, exposure can add up quickly: an intentional violation affecting 10,000 children could mean penalties of up to $75 million.

If you have questions about AB 2273 or related issues, contact the Jackson Lewis attorney with whom you regularly work or reach out to a member of our Privacy, Data, and Cybersecurity practice group.

August 24, 2022, marked a milestone for the California Consumer Privacy Act (CCPA): the California Attorney General announced the first public enforcement action and settlement under the law, against beauty retailer Sephora.

Since July 2022, the California Attorney General’s (AG) office has conducted an investigative sweep of online retailers to check compliance with the CCPA, sending out over 100 notices of alleged CCPA violations. The notices gave businesses a 30-day period to correct the alleged violations before enforcement action is taken. Attorney General Rob Bonta stated that, after receiving the notices, the “vast majority” of businesses changed their practices to comply with the CCPA.

The State alleged that Sephora violated the CCPA by failing to disclose to consumers that it was selling their personal information, failing to process user requests to opt out of sale via user-enabled global privacy controls, and failing to cure these violations within the 30-day notice period. Specifically, the State alleged that Sephora did not notify its consumers that it had arrangements with third parties (such as market research firms) under which Sephora allowed them to install tracking software on its website and app so those third parties could monitor consumers as they shopped. Under the terms of the settlement, “sale” included “sale using online tracking technology,” broadly defined as a business disclosing or making available consumers’ personal information to third parties through online tracking technologies such as pixels, web beacons, software development kits, third-party libraries, and cookies, in exchange for monetary or other valuable consideration, including personal information or other information such as analytics or free or discounted services. In other words, “sale” is broader than simply handing information to a third party in exchange for money.
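
For context, the Global Privacy Control (GPC) is a browser-level signal sent with each request as the HTTP header Sec-GPC: 1 (and exposed to page scripts as navigator.globalPrivacyControl). Below is a minimal sketch, in TypeScript using Express, of how a site might detect the signal server-side; the middleware and response behavior are hypothetical illustrations of the mechanics, not a statement of what the CCPA requires.

```typescript
import express, { Request, Response, NextFunction } from "express";

const app = express();

// Hypothetical middleware: GPC-enabled browsers send "Sec-GPC: 1" on
// every request; record the opt-out preference for downstream handlers.
app.use((req: Request, res: Response, next: NextFunction) => {
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
});

app.get("/", (_req: Request, res: Response) => {
  if (res.locals.gpcOptOut) {
    // Treat the signal as an opt-out of "sale"/"sharing": e.g., suppress
    // third-party tracking pixels and analytics tags for this visitor.
    res.send("GPC detected: third-party tracking disabled for this session.");
  } else {
    res.send("No GPC signal detected.");
  }
});

app.listen(3000);
```

What honoring the signal requires in practice (which tags to suppress, how to record the preference) is a legal and engineering question beyond this sketch.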

The State considered Sephora’s arrangement with these third parties a “sale” of consumer information under the CCPA. In short, the State alleged: “Sephora did not tell consumers that it sold their personal information; instead, Sephora did the opposite, telling California consumers on its website that ‘we do not sell personal information.’”

The State and Sephora reached a settlement that includes $1.2 million in penalties as well as injunctive terms requiring Sephora to:

  • Allow consumers to opt out of the sale of personal information, including via the Global Privacy Control
  • Clarify its online disclosures and privacy policy
  • Conform its service provider agreements to the CCPA
  • Provide reports to the Attorney General relating to its sale of personal information

On January 1, 2023, the California Privacy Rights Act (CPRA) takes effect, amending the CCPA to eliminate the mandatory cure period and instead give the California Privacy Protection Agency (CPPA) discretion to provide time to cure.

In light of the State’s push toward enforcement and the rapidly approaching effective date of the CPRA, businesses must review their compliance efforts under the CCPA and CPRA. If you need assistance with compliance, contact a Jackson Lewis attorney or the CCPA Team.

For the past few years, California’s comprehensive privacy law known as the California Consumer Privacy Act (“CCPA”) included an important partial exemption for employees, applicants, and independent contractors (collectively, “workforce members”). The California Privacy Rights Act, which amended the CCPA, extended the exemption through December 31, 2022. While many expected the exemption would be extended, the current California legislative session ended on August 31, 2022, without a bill to do so.

The failure to get an extension across the legislative finish line leaves CCPA-covered businesses with little time to begin expanding their CCPA compliance efforts. Currently, compliance with respect to workforce members, and certain others, is limited: in general, it means providing a notice at or before the time of collection of personal information and maintaining reasonable safeguards to protect certain personal information. Come January 1, 2023, employers will need to, among other things, expand their privacy policies to address workforce members and be ready to respond to workforce members’ requests concerning their rights under the CCPA, including the right to delete their personal information.

Another exemption, known by some as the “B2B” exemption, generally excluded the personal information of individuals in their capacities as representatives of entities doing business with CCPA-covered businesses. It appears that exemption also will cease to apply in California on January 1, 2023.

For employers wondering if this applies to them and what needs to be done next, our CCPA/CPRA FAQs provide some helpful information, addressing questions such as:

  • Which businesses does the CCPA/CPRA apply to?
  • What is personal information under the CCPA?
  • Does the CCPA apply to employee/applicant data?

Of course, the answer to the last question is modified by this development; we will be updating the FAQs accordingly, as well as for the CPRA regulations, which currently are in proposed form.

Key steps for compliance will include, among other things:

  • Getting a better handle on the personal information collected, used, retained, and disclosed about workforce members,
  • Updating the business’ privacy policy,
  • Updating agreements with service providers, and
  • Training staff on responding to requests from workforce members concerning their privacy rights under the CCPA.

It is worth noting that the other four states with comprehensive privacy laws – Colorado, Connecticut, Utah, and Virginia – all have excluded the personal information of individuals when acting in an employment or commercial context.

If you have questions about compliance requirements under CCPA/CPRA please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

On August 17, 2022, New York announced an amendment to the Continuing Legal Education (CLE) Program Rules, which adds a requirement for attorneys to complete at least one CLE credit hour in Cybersecurity, Privacy, and Data Protection as part of fulfilling their CLE requirements.

Attorneys admitted in New York will be required to comply starting July 1, 2023.

Subjects that will satisfy the new requirement include:

  • Cyberthreats
  • Cyberattacks
  • Data breaches
  • Securing and protecting electronic data and communication
  • Appropriate cybersecurity and privacy policies and protocols
  • Compliance with professional and ethical obligations to protect confidential client and law firm data

Even in-house counsel and law firms outside New York should consider such training to ensure an understanding of data privacy and security laws, as attacks against law firms have increased, ethical rules are tightening, and data privacy and security are becoming increasingly important to clients.

If you need assistance in training attorneys or other high-level employees regarding data privacy and security, contact the Jackson Lewis attorney with whom you regularly work or reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss our training capabilities.

A $300,640 settlement announced yesterday by the Office for Civil Rights (OCR) provides important reminders about the HIPAA Privacy Rule and data privacy practices generally: robust data disposal practices are critical, and “protected health information” (PHI) is not limited to diagnoses or particularly sensitive information.

The OCR’s settlement involved a New England dermatology practice that reported a HIPAA breach last year after empty specimen containers with PHI on the labels were placed in a garbage bin in the practice’s parking lot. The containers’ labels included patient names, dates of birth, dates of sample collection, and the name of the provider who took the specimen. According to the Resolution Agreement, the practice

regularly discarded specimen containers with an attached label that contained PHI as regular waste, bagged and placed in an exterior dumpster…without alteration to the PHI containing label.

Data Disposal

The disposal practice described above may be more common than we think, and it raises risks well beyond HIPAA and PHI. The OCR announcement reminds covered entities and business associates of HIPAA FAQs addressing data disposal. Here are some key points from those FAQs:

  • Reasonable safeguards must be implemented to limit incidental, and avoid prohibited, uses and disclosures of PHI. This includes procedures for disposing of electronic PHI and/or the hardware or electronic media on which it is stored, as well as for removing electronic PHI from electronic media before the media are made available for reuse.
  • Workforce members must be trained on and follow the disposal policies and procedures.
  • HIPAA does not specify a particular disposal method, but covered entities and business associates “are not permitted to simply abandon PHI or dispose of it in dumpsters or other containers that are accessible by the public or other unauthorized persons.” This includes paper records, labeled prescription bottles, hospital identification bracelets, PHI on electronic media, etc. Examples of disposal methods include:
    • Paper records with PHI: shred, burn, pulp, or pulverize the records so that PHI is rendered essentially unreadable, indecipherable, and otherwise cannot be reconstructed.
    • Labeled prescription bottles and other PHI: maintain in opaque bags in a secure area and use a disposal vendor, as a business associate, to pick up and shred or otherwise destroy the PHI.
    • Electronic media with PHI: clear (using software or hardware products to overwrite media with non-sensitive data), purge (degaussing or exposing the media to a strong magnetic field in order to disrupt the recorded magnetic domains), or destroying the media (disintegration, pulverization, melting, incinerating, or shredding).
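
To make the “clear” option concrete, here is a minimal sketch, in TypeScript for Node.js, of overwriting a file with zeros before deleting it. This is an illustration of the concept only, not a certified sanitization method: on SSDs and journaling or copy-on-write file systems, overwriting in place may not reach every copy of the data, so purging or physical destruction remains the safer route for sensitive media.

```typescript
import { openSync, fstatSync, writeSync, fsyncSync, closeSync, unlinkSync } from "fs";

// Illustrative "clear" operation: overwrite a file's contents with
// non-sensitive data (zeros) before unlinking it.
function clearAndDelete(path: string): void {
  const fd = openSync(path, "r+");
  try {
    const { size } = fstatSync(fd);
    const zeros = Buffer.alloc(64 * 1024); // 64 KB block of 0x00 bytes
    let offset = 0;
    while (offset < size) {
      const chunk = Math.min(zeros.length, size - offset);
      writeSync(fd, zeros, 0, chunk, offset);
      offset += chunk;
    }
    fsyncSync(fd); // flush the overwrite to disk before deleting
  } finally {
    closeSync(fd);
  }
  unlinkSync(path);
}

// Hypothetical usage:
// clearAndDelete("/tmp/specimen-label-export.csv");
```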

Of course, these best practices can be applied beyond HIPAA PHI to personal information as well as confidential company data.

Protected Health Information

A common and not necessarily unreasonable first reaction when considering the response to a potential data breach is that the compromised data is not PHI because it does not include diagnosis information. In cases like the one above, one might surmise that patient names, dates of birth, dates of sample collection, and name of provider who took the specimen are not PHI, or at least not sufficiently sensitive to warrant notification.

The definition of PHI starts with the definition of “individually identifiable health information,” which generally means identifiable health information transmitted or maintained in electronic media or any other form or medium that:

Relates to the past, present, or future physical or mental health or condition of an individual; the provision of health care to an individual; or the past, present, or future payment for the provision of health care to an individual.   

See 45 CFR 160.103; see also 42 U.S.C. 1320d(6). This includes demographic information, which likely covers items such as name, address and other contact information, age, gender, and insurance status.

When dealing with information of a personal nature, it is important to understand the different buckets into which that information may fall. It might not seem intuitive that certain categories of information, if compromised, could trigger a notification obligation.

For covered entities and business associates under HIPAA, and just about any other organization that handles confidential personal and business information, completely and securely disposing of that information when it is no longer needed is an important step in limiting information risk. Additionally, it can be risky to make assumptions about the regulatory obligations attaching to certain data without doing the homework or seeking experienced counsel.

On August 11, 2022, the Federal Trade Commission (FTC) announced proposed rulemaking pertaining to “commercial surveillance and lax data security.” The overall focus of the potential rulemaking, however, is consumer privacy and data security. The FTC states in its notice that its “extensive enforcement and policy work over the last couple of decades on consumer data privacy and security have raised important questions about the prevalence of harmful commercial surveillance and lax data security practices,” and that this experience suggests enforcement alone, without rulemaking, is not sufficient.

The agency defines “commercial surveillance” as “the business of collecting, analyzing, and profiting from information about people.”

FTC Chair Lina M. Khan stated in the commission’s press release, “[o]ur goal today is to begin building a robust public record to inform whether the FTC should issue rules to address commercial surveillance and data security practices and what those rules should potentially look like.”

In a fact sheet released in conjunction with the notice of proposed rulemaking, the FTC identified issues in the “commercial surveillance industry” including the collection of consumer information, data security, harm to children, bias and discrimination, and dark patterns. Similar practices and concerns were recently addressed in both technical guidance issued by the Equal Employment Opportunity Commission (EEOC) and Department of Justice (DOJ), as well as pending federal legislation, the American Data Privacy and Protection Act (ADPPA).

During the press conference regarding the proposed rulemaking, the FTC expressed support for the pending ADPPA and stated that it does not intend its rules to overlap with that legislation should it pass.

The FTC will host a virtual public forum on commercial surveillance and data security on September 8, 2022, from 2:00 p.m. until 7:30 p.m. The FTC will also solicit comments on the proposed rulemaking, though the link to submit comments is not yet available.

Jackson Lewis will continue to track the FTC’s proposed rulemaking and related guidance. If you have questions about the proposed rulemaking, FTC enforcement actions, or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.