In response to the United States Supreme Court decision in Dobbs v. Jackson Women’s Health Organization, President Joe Biden signed an Executive Order on Friday, July 8, 2022, designed to protect access to reproductive health care services. In addition to measures seeking to safeguard access to abortion and contraception, the Executive Order includes provisions aimed at protecting the privacy of patients and their access to accurate information, which will likely build on guidance from the Secretary of Health and Human Services issued June 29, 2022, addressing related concerns.

When individuals think about the privacy and security of their health information moving through the U.S. health care system, their first stop usually is the complex set of rules under “HIPAA” – referring to the Privacy, Security, and related rules under the Health Insurance Portability and Accountability Act of 1996. For nearly 20 years, the HIPAA rules applicable to most healthcare providers and health plans have worked to safeguard “protected health information” or “PHI.” During that time, a debate has raged over the effectiveness of the rules, with some arguing the rules are too stringent, others arguing they are not stringent enough, and still others believing HIPAA is just right.

Of course, the protection of medical information does not begin or end with HIPAA. There is a myriad of other federal, state, and local laws that potentially impact the privacy and security of individually identifiable medical information generated in connection with the provision of, and payment for, reproductive health care services. Here, we address the Executive Order and recent OCR guidance. However, organizations must also consider these other laws when making decisions concerning the collection, use, disclosure, retention, and security of such information.

Executive Order.

With regard to protecting the privacy of patients and their access to accurate information, the Executive Order focuses on potential threats to patient privacy caused by (i) the transfer and sale of sensitive health-related data, and (ii) digital surveillance related to reproductive healthcare services. Related measures in the Order call for efforts to protect people seeking reproductive health services from fraudulent schemes or deceptive practices. To these ends, the Order directs:

  • the Secretary of Health and Human Services (HHS) to consider actions, including additional guidance under HIPAA, to strengthen the protection of sensitive information related to reproductive healthcare services and bolster patient-provider confidentiality. The Secretary also must work with the US Attorney General to consider actions designed to educate consumers on protecting privacy and limiting the collection and sharing of their sensitive health information.
  • the Chair of the Federal Trade Commission (FTC) to consider actions, including under the Federal Trade Commission Act, to protect consumers’ privacy when seeking information about and provision of reproductive healthcare services.
  • the Secretary to consult with the FTC Chair and Attorney General on ways to address deceptive and fraudulent practices related to reproductive healthcare services, including online, and to protect access to accurate information.

It remains to be seen what steps these agencies will take in response to the Executive Order. As noted above and summarized below, the Secretary has already issued guidance concerning patient privacy following Dobbs.

OCR Guidance Regarding Patient Privacy Following Dobbs.

Prior to the President’s Executive Order, the HHS Office for Civil Rights issued post-Dobbs guidance to help protect patients seeking reproductive care. The guidance comes in the form of reminders to providers and patients:

  • Reminder to providers about disclosures to third parties. In short, this guidance reminds HIPAA covered entities and business associates that they can use and disclose PHI, without an individual’s signed authorization, only as expressly permitted or required by the Privacy Rule. It reiterates some of the HIPAA Privacy Rule’s existing restrictions on disclosures of PHI (i) when required by law, (ii) for law enforcement purposes, and (iii) to avert a serious threat to health or safety. For example, the guidance makes clear that the HIPAA Privacy Rule permits but does not require covered entities to disclose PHI about an individual, without the individual’s authorization, when such disclosure is required by another law and the disclosure complies with the requirements of the other law. Further, the guidance reminds covered entities and business associates that the permission to disclose as “required by law” requires a mandate in the law that compels disclosure and is enforceable in court, as explained in the following example:

An individual goes to a hospital emergency department while experiencing complications related to a miscarriage during the tenth week of pregnancy. A hospital workforce member suspects the individual of having taken medication to end their pregnancy. State or other law prohibits abortion after six weeks of pregnancy but does not require the hospital to report individuals to law enforcement. Where state law does not expressly require such reporting, the Privacy Rule would not permit a disclosure to law enforcement under the “required by law” permission. Therefore, such a disclosure would be impermissible and constitute a breach of unsecured PHI requiring notification to HHS and the individual affected. (emphasis added)

The guidance includes a similar analysis when considering law enforcement requests made through legal processes such as court orders or subpoenas.

  • Reminders to patients to protect medical information when using period trackers and other health information apps. In general, PHI accessed or stored on individuals’ personal devices is not protected under the HIPAA rules. OCR cites recent reports of patients expressing concerns that period trackers and other health information apps threaten privacy by disclosing geolocation data, which may be misused by those seeking to deny care.

To help address these concerns, the guidance provides steps to limit how certain devices collect and share health and other personal information without the knowledge of the device’s owner. This includes instructions for turning off location services and best practices for selecting apps, browsers, and search engines. It also provides a list of several resources for protecting privacy when using apps and other electronic products, including from the FTC and Consumer Reports.


What all this means for healthcare providers, health plans, and business associates is heightened attention when handling individually identifiable health information related to reproductive health care services, including when determining whether it is permissible to disclose HIPAA protected health information, particularly without the authorization of the individual to whom it relates. Such organizations also will need to consider more stringent state laws that may provide stronger privacy protections, while health plans covered by the Employee Retirement Income Security Act (ERISA) will have to assess whether such state laws are preempted by ERISA. These are not easy tasks in a world of growing privacy protections, data breaches, labor shortages, and rapidly advancing technologies.

At the start of June, the California Privacy Protection Agency (CPPA), the agency tasked with implementing and enforcing the California Privacy Rights Act (CPRA) which amended the California Consumer Privacy Act (CCPA), voted to begin the rulemaking process.

On July 8, 2022, the CPPA officially began the formal rule-making process to adopt proposed regulations implementing the CPRA by releasing the notice of proposed rulemaking. The CPPA stated the proposed regulations are intended to: “(1) update existing CCPA regulations to harmonize them with CPRA amendments to the CCPA; (2) operationalize new rights and concepts introduced by the CPRA to provide clarity and specificity to implement the law; and (3) reorganize and consolidate requirements set forth in the law to make the regulations easier to follow and understand.”

The notice also indicates that the CPPA will not be promulgating rules on cybersecurity audits or automated decision-making technology at this time.

A hearing on the proposed regulations is scheduled to occur on August 24 and 25, 2022. Written comments on the proposed regulations must be submitted in advance of the public hearing, by August 23, 2022. Comments can be submitted by email or by mail to the California Privacy Protection Agency, Attn: Brian Soublet, 2101 Arena Blvd., Sacramento, CA 95834.

Jackson Lewis will continue to track information related to privacy regulations and related issues. For additional information on the CPRA, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

Last month, the Illinois Supreme Court heard oral argument in the closely watched case of Cothron v. White Castle System Inc., and is set to decide when claims under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act accrue.

The court’s forthcoming decision in Cothron is likely to have a significant impact on Illinois employers who are facing BIPA litigation, or who use, or have used, biometric technology in the workplace.

Read Full Article at Law360

Subscription may be required to view article

The federal government has been trying to reach a consensus on data privacy and thus far has failed to pass legislation. On June 3, 2022, a bipartisan draft bill, titled the American Data Privacy and Protection Act, was released by the Committee on Energy and Commerce. The bill intends to provide comprehensive data privacy legislation, including the development of a uniform, national data privacy framework and a robust set of consumer privacy rights.

A covered entity for purposes of the draft bill is defined as “any entity or person that collects, processes, or transfers covered data” and is subject to the Federal Trade Commission Act, is a common carrier under the Communications Act of 1934, or is an organization not organized to carry on business for their own profit or that of their members.

Per the draft, the new act would be carried out by a new bureau within the Federal Trade Commission (FTC). Interestingly, the proposed legislation would preempt similar state laws, though it excludes the CCPA/CPRA in California and the BIPA and the GIPA in Illinois from that preemption.

The draft bill covers a wide swath of consumer data privacy issues, from data collection to civil rights and algorithms. The following are some highlights of note:

Data Collection Requirements

The draft legislation imposes a duty on all covered entities not to unnecessarily collect or use covered data, with “covered data” defined broadly as “information that identifies or is linked or reasonably linkable to an individual or a device that identifies or is linked or reasonably linkable to 1 or more individuals, including derived data and unique identifiers.” The FTC would be charged with issuing additional guidance regarding what is reasonably necessary, proportionate, and limited for purposes of collecting data.

Covered entities would have a duty to implement reasonable policies, practices, and procedures for collecting, processing, and transferring covered data. Further, covered entities would be required to provide individuals with privacy policies detailing their data processing, transfer, and security activities in a readily available and understandable manner. The policies would need to include contact information, the affiliates of the covered entity to which it transfers covered data, and the purposes for each category of covered data the entity collects, processes, and transfers.

Covered entities would be prohibited from conditioning, or effectively conditioning, the provision or termination of services or products on an individual’s waiver of any privacy rights established under the law.

There would be additional executive responsibility for large data holders, including requiring CEOs and privacy officers to annually certify that their company maintains reasonable internal controls and reporting structures for compliance with the statute.

Individual Rights Created

Individuals would be granted rights to access, correct, delete, and port covered data that pertains to them. These are similar to many of the rights California residents have under the CCPA/CPRA. The right of access would include obtaining covered data in a human-readable, downloadable format that individuals can understand without expertise, along with the names of any other entities to which the data was transferred, the categories of sources used to collect the covered data, and the purposes for transferring the data.

Sensitive covered data, which includes items such as an individual’s health diagnosis, financial account information, biometric information, and government identifiers such as Social Security numbers, among other items, could not be collected without the individual’s affirmative consent.

Civil Rights and Algorithms

Unsurprisingly, algorithms, which were recently addressed in guidance from the EEOC and DOJ, are also addressed in this draft legislation. Under the proposed legislation, covered entities may not collect, process, or transfer data in a manner that discriminates based on race, color, religion, national origin, gender, sexual orientation, or disability. This section of the law would require large data holders that use algorithms to assess their algorithms annually and submit annual impact assessments to the FTC.

While comprehensive national privacy legislation has previously faced difficulties being passed, Jackson Lewis will continue to track the status of this legislation as it moves through Congress. If you have questions about this proposed legislation or related issues please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

At the California Privacy Protection Agency (CPPA) Board meeting on June 8, 2022, the Board voted to begin the rulemaking process. The Board previously released a 66-page draft of regulations that is intended to implement and interpret the California Consumer Privacy Act (CCPA) as amended by the California Privacy Rights Act (CPRA). While the draft redline regulations address topics such as implementing “easy to understand” language for consumer CCPA requests, the draft does not address all 22 regulatory topics required under the CPRA. For example, the draft does not cover the opt-in/opt-out of automated decision-making technology.

The next step is for the Board to file a Notice of Proposed Rulemaking Action, commencing the formal rulemaking process. The notice will be posted on the agency’s website and published in the California Regulatory Notice Register. Once the notice is filed, a public comment period of at least 45 days will begin, during which stakeholders and interested parties can submit written comments. The CPPA will also schedule a public hearing on the regulations.

Jackson Lewis will continue to track information related to privacy regulations and related issues. For additional information on the CPRA, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

On June 8, 2022, the California Privacy Protection Agency (CPPA) Board will meet to discuss and take potential action regarding a draft of its proposed regulations. The June 8th public meeting includes an agenda item where the CPPA Board will consider “possible action regarding proposed regulations … including possible notice of proposed action.”

In advance of the meeting, the CPPA posted on its website draft redline regulations for discussion purposes, revising the current regulations released by the California Attorney General (recently renumbered by the CPPA). The quietly released 66-page draft regulations are intended to implement and interpret the California Consumer Privacy Act (CCPA) as amended by the California Privacy Rights Act (CPRA). While the draft redline regulations address topics such as implementing “easy to understand” language for consumer CCPA requests, the draft does not address all 22 regulatory topics required under the CPRA. For example, the draft does not cover the opt-in/opt-out of automated decision-making technology.

Here are some of the highlights of the proposed draft regulations:

  • Adds a definition of “disproportionate effort” within the context of responding to a consumer request. For example, disproportionate effort might be involved when the personal information that is the subject of the request is not in a searchable or readily accessible format, is maintained only for legal or compliance purposes, is not sold or used for any commercial purpose, and would not impact the consumer in any material manner;
  • Adds a new section on the restrictions on the collection and use of personal information that contains illustrative examples. One example is a business that offers a mobile flashlight app. That business would need the consumer’s explicit consent to collect the consumer’s geolocation information because that collection is incompatible with the context in which the personal information is collected in connection with the app;
  • Adds requirements for disclosures and communications to consumers. This includes making sure communications are reasonably accessible to consumers with disabilities whether online or offline;
  • Adds requirements for methods for submitting CCPA requests and obtaining consumer consent. A key principle here is that the process for consumers to select a more privacy-protective option should not be more difficult or time-consuming than the process for selecting a less protective option. Symmetry is the goal; and
  • Makes substantial revisions to the requirements for the privacy policy that a business is required to provide to consumers detailing the business’s online and offline practices regarding collection, use, sale, sharing, and retention of personal information. This includes new provisions concerning the right to limit the use and disclosure of sensitive personal information and the right to correct personal information.

To date, the Agency has not issued a Notice of Proposed Rulemaking to start the formal rulemaking process, and the timeframe associated with the draft regulations remains unclear, especially given that the CPRA requires the CPPA to finalize regulations by July 1, 2022. It is expected that the June 8th meeting will provide details on the process.

Jackson Lewis will continue to track information related to privacy regulations and related issues. For additional information on the CPRA, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

Organizations attacked with ransomware have a bevy of decisions to make, very quickly! One of those decisions is whether to pay the ransom. Earlier this year, I had the honor of contributing to a two-part series entitled Ransomware: To pay or not to pay? (Part 1 and Part 2). Joined by Danielle Gardiner, CPA, CFF, SVP of Lowers Forensics International, and Shiraz Saeed, VP, Cyber Risk Product Leader at Arch Insurance Group, we explored a range of considerations that organizations need to weigh when making this consequential decision under very difficult circumstances. A new law in North Carolina makes this decision a lot easier for certain public sector entities in the state: they cannot pay.

North Carolina’s Current Operations Appropriations Act of 2021 added a new article to Chapter 143 of the State’s General Statutes, which now reads in part:

No State agency or local government entity shall submit payment or otherwise communicate with an entity that has engaged in a cybersecurity incident on an information technology system by encrypting data and then subsequently offering to decrypt that data in exchange for a ransom payment.

If a state agency or local government entity experiences a “ransom request” in connection with a cybersecurity incident, it must consult with the state’s Department of Information Technology. These rules apply to the following entities:

  • State agency – Any agency, department, institution, board, commission, committee, division, bureau, officer, official, or other entity of the executive, judicial, or legislative branches of State government. The term includes The University of North Carolina and any other entity for which the State has oversight responsibility.
  • Local government entity – A local political subdivision of the State, including, but not limited to, a city, a county, a local school administrative unit as defined in G.S. 115C‑5, or a community college.

Double extortion ransomware attacks raise an interesting issue under this law. These attacks are more sinister because they do not just encrypt the victim’s system and demand payment in exchange for a decryption utility. They also exfiltrate data from the victim’s systems, threatening to disclose it on the dark web unless the attacker is paid ransom in exchange for the “promise” to delete and not disclose the data.

It is unclear if the North Carolina law reaches this second extortion in double extortion ransomware attacks, but its proscription is consistent with the Federal Bureau of Investigation’s position; the FBI does not support paying a ransom in response to a ransomware attack. But when the possibility of payment is on the table, organizations need to know that simply making the payment could put them into legal jeopardy.

As stated in Ransomware: To pay or not to pay? – Part 2:

The primary basis of this legal jeopardy is that under the International Emergency Economic Powers Act (IEEPA) and the Trading with the Enemy Act (TWEA) U.S. persons engaging in transactions with certain listed organizations can subject those persons to significant penalties. Specifically, the U.S. Department of Treasury’s Office of Foreign Assets Control (OFAC) maintains the Specially Designated Nationals and Blocked Persons List (SDN List), in addition to other blocked persons. A cryptocurrency transaction with one of these entities may result in the victim’s ability to retrieve access to its systems and data, but it could subject the organization to OFAC enforcement.

In its latest round of guidance on this issue, on October 1, 2020, OFAC issued the Advisory on Potential Sanctions Risks for Facilitating Ransomware Payments (Advisory). The Advisory makes clear that entities involved in facilitating a ransom payment may have done so in violation of OFAC regulations, subjecting them to enforcement action and fines. This risk is heightened by the difficulty of determining who is on the other side of the Bitcoin transaction.

The Advisory highlights these concerns, while also noting that certain pre- and post-breach actions could mitigate OFAC exposure. Implementing “a risk-based compliance program” pre-breach and promptly making a “complete report of a ransomware attack to law enforcement” after an attack can, according to the Advisory, mitigate enforcement.

OFAC compliance may not be the only regulatory hurdle to overcome if momentum is moving in favor of payment. In the summer of 2021, following a string of massive ransomware attacks, including the Colonial Pipeline attack, four states proposed legislation that would ban ransom payments. These efforts have not been successful to date, but organizations need to consider regulatory limitations on ransom payments as privacy and cybersecurity laws rapidly evolve.

Ransomware attacks can happen to any organization, large or small. It is critical that organizations not only strengthen their systems to prevent such attacks but also strengthen their preparedness should an attack happen. This includes thinking through, ahead of time, the organization’s approach to the question of whether or not to pay ransom.

States continue to tinker with their breach notification laws. The latest modification to the Indiana statute relates to the timing of notification. On March 18, 2022, Indiana Governor Eric Holcomb signed HB 1351, which tightens the rules for providing timely notice to individuals affected by a data breach.

Prior to the change, the relevant section read:

a person required to make a disclosure or notification under this chapter shall make the disclosure or notification without unreasonable delay

After the change, which is effective July 1, 2022, it reads:

a person required to make a disclosure or notification under this chapter shall make the disclosure or notification without unreasonable delay, but not more than forty-five (45) days after the discovery of the breach.

As revised, the statute attempts to make clear the last date by which notification must be provided – not later than 45 days after discovery. But there remains a question about whether notification could be (or should be) provided before that 45-day period ends. After discovery of a breach, a period of delay in notification is permitted if it is reasonable. The current law describes reasonable delay, as follows:

[A] delay is reasonable if the delay is:

(1) necessary to restore the integrity of the computer system;

(2) necessary to discover the scope of the breach; or

(3) in response to a request from the attorney general or a law enforcement agency to delay disclosure because disclosure will:

(A) impede a criminal or civil investigation; or

(B) jeopardize national security.

IC 24-4.9-3-3(a).

This analysis can become significantly more complicated when residents in multiple states are affected by the breach. Although some states have a similar 45-day notification deadline (e.g., Alabama, Maryland, Ohio, and Wisconsin), other states have shorter periods (e.g., 30 days in Florida) or longer periods (e.g., 60 days in Connecticut). Additionally, in our experience as breach counsel, covered entities with breach reporting obligations can expect heightened attention by the Indiana Attorney General’s Office to investigate perceived delays in notification. This change reflects that focus on timeliness and may become the source of greater enforcement activity in Indiana.

On May 20, 2022, the Federal Trade Commission’s Team CTO and the Division of Privacy and Identity Protection published a blog post entitled, “Security Beyond Prevention: The Importance of Effective Breach Disclosures.” In the post, the FTC takes the position that in some cases there may be a de facto data breach notification requirement, despite there currently being no section of the Federal Trade Commission Act or implementing regulation imposing an express, broadly applicable data breach notification requirement. Businesses should nonetheless take this de facto rule into account as part of their incident response plans.

The post stresses the importance of strong incident detection and response processes, noting they are vital to maintaining reasonable security.  The notification component can prevent and minimize consumer harm from breaches because, among other things, it can spur consumers to take actions to mitigate harm resulting from the breach. According to the FTC, failure to maintain such practices could indicate a lack of competition in the marketplace. Notably, the post states:

Regardless of whether a breach notification law applies, a breached entity that fails to disclose information to help parties mitigate reasonably foreseeable harm may violate Section 5 of the FTC Act.

The American Recovery and Reinvestment Act of 2009 directed the FTC to establish rules to require notification to consumers when the security of their individually identifiable health information has been breached. However, those rules apply only to vendors of personal health records and related entities, although a recent FTC policy statement clarified the application of the rule. In support of the blog post’s more broadly applicable de facto requirement, the FTC discussed some recent enforcement actions.

The post referred to the recent settlement with an online retailer that allegedly failed to timely notify consumers and other relevant parties after data breaches, thereby preventing parties from taking measures to mitigate harm.  The FTC viewed this as an unfair trade practice. Looking to other enforcement actions as examples, the post explained that deceptive statements can hinder consumers from taking critical actions to mitigate foreseeable harms like identity theft, loss of sensitive data, or financial impacts. Looking at these cases together, the post concluded that:

companies have legal obligations with respect to disclosing breaches, and that these disclosures should be accurate and timely.

As any victim of a data incident or experienced breach counsel knows, a critical part of just about any security incident is determining whether there has been a breach and whether notification is required. For an incident affecting individuals in a significant number of countries and/or states, navigating the various data breach statutes and regulations is challenging. According to the FTC’s post, even if that process leads to the conclusion that notification is not required under state law, for example, the FTC’s de facto rule may apply to avoid an allegation of unfair or deceptive trade practice.

Businesses, therefore, should review their incident response plans in light of this informal guidance.

When the California Consumer Privacy Act of 2018 (CCPA) became law, it was only a matter of time before other states adopted their own statutes intending to enhance privacy rights and consumer protection for their residents. After overwhelming support in the state legislature, Connecticut is about to become the fifth state with a comprehensive privacy law, as SB 6 awaits signature by Governor Ned Lamont.

If signed, the “Act Concerning Personal Data Privacy and Online Monitoring” (Act) will take effect July 1, 2023, the same day as the Colorado Privacy Act.

Key Elements

The Act largely tracks the Virginia Consumer Data Protection Act (VCDPA) and has the following key elements:

  • Jurisdictional Scope. The Act would apply to persons that conduct business in Connecticut or that produce products or services that are targeted to residents of Connecticut and that during the preceding calendar year: (i) controlled or processed personal data of at least 75,000 consumers (under the VCDPA this threshold is at least 100,000 Virginians) or (ii) controlled or processed personal data of at least 25,000 consumers and derived over 25 percent of gross revenue from the sale of personal data (50 percent under the VCDPA).
  • Exemptions. The Act provides exemptions at two levels, the entity level and the data level. Entities exempted from the Act include (i) agencies, commissions, districts, etc. of the state or political subdivisions, (ii) nonprofits, (iii) higher education, (iv) national securities associations, (v) financial institutions or data subject to Gramm-Leach-Bliley Act (GLBA), and (vi) covered entities and business associates as defined under HIPAA.

The Act also exempts a long list of categories of information, including protected health information under HIPAA and certain identifiable private information in connection with human subject research. The Act also exempts certain personal information under the Fair Credit Reporting Act, Driver’s Privacy Protection Act of 1994, Family Educational Rights and Privacy Act, and other laws. In general, exempt data also includes data processed or maintained (i) in the course of an individual applying to, being employed by, or acting as an agent or independent contractor of an entity, to the extent that the data is collected and used within the context of that role, (ii) as emergency contact information, or (iii) that is necessary to retain to administer benefits for another individual relating to the individual in (i) above.

  • Personal Data. Similar to the CCPA and GDPR, the Act defines personal data broadly to include any information that is linked or reasonably linkable to an identified or identifiable individual, but excludes de-identified data and publicly available information. However, maintaining de-identified data is not without obligation under the Act. Controllers that maintain such data must take reasonable measures to ensure the data cannot be reidentified. They must also publicly commit to maintaining and using de-identified data without attempting to reidentify it. Finally, the controller must contractually obligate any recipients of the de-identified data to comply with the Act.
  • Sensitive Data. Similar to the VCDPA, the Act includes a category for “sensitive data,” defined as (i) data revealing racial or ethnic origin, religious beliefs, mental or physical health condition or diagnosis, sex life, sexual orientation, or citizenship or immigration status; (ii) genetic or biometric data processed for the purpose of uniquely identifying an individual; (iii) personal data collected from a known child; or (iv) precise geolocation data. Notably, sensitive data cannot be processed without consumer consent. In the case of sensitive data of a known child, the data must be processed in accordance with the federal Children’s Online Privacy Protection Act (COPPA). Also, controllers must conduct and document a data protection assessment specifically for the processing of sensitive data.
  • Consumer. The Act defines “consumer” as “an individual who is a resident of” Connecticut. Consumers under the Act do not include individuals acting (i) in a commercial or employment context or (ii) as an employee, owner, director, officer, or contractor of certain entities, including a government agency, whose communications or transactions with the controller occur solely within the context of that individual’s role with that entity.
  • Consumer Rights. Consumers under the Act would be afforded the following personal data rights:
    • To confirm whether or not a controller is processing their personal data and to access such personal data;
    • To correct inaccuracies in their personal data, taking into account the nature of the personal data and the purposes of the processing of their personal data;
    • To delete personal data provided by or obtained about them;
    • To obtain a copy of their personal data processed by the controller, in a portable and, to the extent technically feasible, readily usable format that allows them to transmit the data to another controller without hindrance, where the processing is carried out by automated means and without revealing trade secrets; and
    • To opt out of the processing of the personal data for purposes of (i) targeted advertising, (ii) sale, or (iii) profiling in furtherance of decisions that produce legal or similarly significant effects concerning them.
  • Reasonable Data Security Requirement. The Act affirmatively requires controllers to establish, implement, and maintain reasonable administrative, technical and physical data security practices to protect the confidentiality, integrity and accessibility of personal data appropriate to the volume and nature of the personal data at issue.
  • Data Protection Assessments. The Act imposes a new requirement on controllers: conducting data protection assessments (as mentioned above regarding sensitive data). Controllers must conduct and document data protection assessments for processing activities involving personal data that present a heightened risk of harm to consumers. These activities include targeted advertising, the sale of personal data, profiling, and the processing of sensitive data. Profiling requires a data protection assessment when it presents a reasonably foreseeable risk of (A) unfair or deceptive treatment of, or unlawful disparate impact on, consumers; (B) financial, physical or reputational injury to consumers; (C) a physical or other intrusion upon the solitude or seclusion, or the private affairs or concerns, of consumers, where such intrusion would be offensive to a reasonable person; or (D) other substantial injury to consumers. When conducting an assessment, controllers must identify and weigh the benefits that may flow, directly and indirectly, from the processing to the controller, the consumer, other stakeholders, and the public against the potential risks to the rights of the consumer. Controllers may also consider how those risks are mitigated by safeguards they can employ. Factors controllers must consider include the use of de-identified data, the reasonable expectations of consumers, the context of the processing, and the relationship between the controller and the consumer whose personal data will be processed.
  • Enforcement. The Connecticut Attorney General’s office would have exclusive authority to enforce the Act. During the first eighteen months the Act is effective (through December 31, 2024), controllers would receive notice of a violation and a 60-day period to cure it. After that, the opportunity to cure may be granted at the Attorney General’s discretion, based on factors such as the number of violations, the size of the controller or processor, and the nature of the processing activities, among others. Violations of the Act constitute an unfair trade practice under Connecticut’s Unfair and Deceptive Acts and Practices (UDAP) law, subjecting violators to civil penalties of up to $5,000, plus actual and punitive damages and attorneys’ fees. The Act expressly excludes a private right of action.
Other states across the country are contemplating ways to enhance their data privacy and security protections. Organizations, regardless of their location, should be assessing and reviewing their data collection activities, building robust data protection programs, and investing in written information security programs.