What do ransomware, Yelp, and website tracking technologies all have in common? They are all troubling areas of concern for HIPAA covered entities and business associates, according to one official from the federal Office for Civil Rights (OCR), which enforces the HIPAA privacy and security rules. Recently, Marianne Kolbasuk McGee, Executive Editor of Information Security Media Group’s (ISMG’s) HealthcareInfoSecurity.com media site, sat down with Susan Rhodes, the OCR’s acting deputy for strategic planning and regional manager, to discuss these issues.

We briefly summarize the discussion below, but you can access the short interview here (~10 min.). It is worth a listen.

Ms. Rhodes outlined three troublesome areas that OCR is watching closely:

  • Hacking/ransomware. Obviously, this continues to be a significant problem for the healthcare sector. According to Ms. Rhodes, ransomware attacks are up 278% in the last 5 years. Developing, maintaining, and practicing an incident response plan is one important tool for dealing with these and other attacks.
  • Online reviews. Negative comments made by customers/patients on popular online review services, such as those offered by Yelp and Google, can be upsetting for any small business. Practitioners in the health care sector, such as physicians and dentists, have to be particularly careful when responding to patient complaints on such platforms, if they respond at all. Their responses could result in the wrongful disclosure of their patients’ protected health information, leading to significant OCR enforcement actions such as those that occurred here and here.
  • Website tracking technologies. Calling this a “hot” area and referencing OCR investigations across the country, Ms. Rhodes directed listeners to the OCR guidance on tracking technologies issued in December 2022. Specifically, she reminded HIPAA covered entities of key considerations when using website tracking technologies including, without limitation, the potential need for business associate agreements and patient consent.

Ms. McGee also inquired about areas where covered entities and business associates’ HIPAA compliance frequently falls short. Ms. Rhodes mentioned a few:

  • Risk analysis – which is foundational to the policies and procedures adopted by covered entities and business associates.
  • Access controls – in short, making sure employees and other workforce members at the covered entity or business associate only have access to the PHI needed to perform their job.
  • Audit controls – regularly reviewing system activity, log files, etc. to identify irregular activity or potential compromises to PHI.

The HIPAA privacy and security rules continue to raise significant compliance challenges for covered entities and business associates. It is important to note that those challenges do not exist only in the physician’s office; they must be managed online as well, including on organizations’ websites.

The healthcare sector is a prime target for data breaches. According to a summary by the HIPAA Journal, 32% of all data breaches between 2015 and 2022 were in the healthcare sector, “almost double the number recorded in the financial and manufacturing sectors.” Industry analysts cite many reasons for this, including the sensitivity of health data and its value on the black market compared to other forms of data. Evidently, another driver of data breaches for healthcare entities is M&A activity.

A recent study suggests that the likelihood for hospitals to experience a data breach doubles during the year before and after a merger. As some expect an increase in hospital mergers in the coming year, one can expect the number of healthcare data breaches to increase.

According to the research, Nan Clement, a Ph.D. candidate in economics in the School of Economic, Political and Policy Sciences at the University of Texas at Dallas, analyzed data breach reports from the Office for Civil Rights for the period 2010 to 2022. Based on her analysis, during the two-year window surrounding a transaction closing (one year before and after the closing date), the chances of a data breach were 6%, compared to 3% for hospitals that merged but were outside that window.

The study also looked at some of the potential reasons for this uptick:

  • Increased interest from hackers – data from Google Trends showed a “connection between increases in searches for a target hospital’s name with increases in hacking activity,” which may stem from increased media attention around the merger.
  • Incompatibility of information systems – trying to merge data on different electronic medical record (EMR) platforms.
  • Increases in insider misconduct

Another reason may simply be a diversion of focus from the day-to-day administrative functions at the hospital, considering how disruptive a merger can be. The FBI also issued a notification advising that ransomware actors target companies involved in significant, time-sensitive financial events to incentivize ransom payment by victims.

We have discussed here data security issues that can arise in the course of a transaction. For any entity involved in M&A activity, especially in the healthcare sector, it is critical to stay focused and realize that the organization may be more of a target at this time. Heightened awareness by the organization’s information security team and increased training and reminders to staff about phishing and other forms of attack could help avoid a data breach during this more vulnerable period. Additionally, the transacting parties might consider this risk and take appropriate steps during the due diligence stage both to protect against an attack, but also to be prepared to respond should one occur.

The Cyber Safety Review Board (Board) issued a report entitled, Review of the Attacks Associated with Lapsus$ and Related Threat Groups (Report), released by the Department of Homeland Security on August 10, 2023. The Report begins with a message from the Board’s Chair and Vice Chair discussing WarGames, a movie with interesting parallels to the present day – the leveraging of AI and large language models into systems (see Joshua/WOPR) and teenagers compromising sophisticated systems (Matthew Broderick as a high school student hacking into the Dept. of Defense). The Report looks at “Lapsus$,” described as a loosely organized group of threat actors, including juveniles in some cases, that gained significant attention after providing a window into its inner workings.

“Lapsus$ made clear just how easy it was for its members (juveniles, in some instances) to infiltrate well-defended organizations.”

Established under President Biden’s Executive Order (EO) 14028 on ‘Improving the Nation’s Cybersecurity’, the role of the Board is to review major cyber events and make concrete recommendations that would drive improvements. The Report does not disappoint in terms of its description of the targeting and nature of attack by Lapsus$ and similar groups, as well as the Board’s recommendations, one being to move toward a “passwordless” world.

While we cannot cover all of the critical and helpful information in the 59-page Report, here are a few highlights.

Multi-factor Authentication Implementations Used Broadly Today are Insufficient.

A reliable joke at any data security conference is how “password” or “123456” continue to be the most popular passwords. Another weakness is the use of the same account credentials across multiple accounts. Multi-factor authentication (MFA) was designed to address these practices by going beyond the password to require one or more additional authenticators before access is permitted. MFA often comes highly recommended to help protect against one of the most financially damaging online crimes, business email compromise (BEC).

Perhaps a bit unsettling for many that have implemented MFA thinking it is the answer to system access vulnerabilities, the Report explains:

the Board saw a collective failure to sufficiently account for and mitigate the risks associated with using Short Message Service (SMS) and voice calls for MFA. In several instances, attackers gained initial access to targeted organizations through Subscriber Identity Module (SIM) swapping attacks, which allowed them to intercept one-time passcodes and push notifications sent via SMS, effectively defeating this widely used MFA control. A lucrative SIM swap criminal market further enabled this pay-for-access to a target’s mobile phone services. Despite these factors, adopting more advanced MFA capabilities remains a challenge for many organizations and individual consumers due to workflow and usability issues.

As expected, however, some methods of MFA are better than others. The Report observed that application or token-based MFA methods, for example, were more resilient.
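To illustrate why app- or token-based MFA resists SIM swapping, below is a minimal sketch of a time-based one-time passcode (TOTP) generator per RFC 6238, the algorithm behind most authenticator apps. Because the shared secret lives on the device and the code never travels over SMS or a voice call, hijacking a phone number yields nothing. This is an illustration only, not a production implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at_time=None, digits=6, step=30):
    """Compute an RFC 6238 time-based one-time passcode from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if at_time is None else at_time) // step)
    # HOTP (RFC 4226): HMAC-SHA1 over the big-endian counter, then dynamic truncation
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# RFC 6238 test vector: the ASCII secret "12345678901234567890"
# (base32: GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ) at time 59 yields "94287082" for 8 digits.
```

An attacker who swaps a victim’s SIM still cannot reproduce the passcode without the stored secret, which is the property the Report credits to these more resilient MFA methods.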

If you are not familiar with SIM swaps, the process goes something like this, as detailed in the Report:

  1. Attacker collects data on victim through social media, phishing, etc.
  2. Attacker uses victim’s credentials to request SIM swap from telecommunications provider.
  3. Telecommunications provider approves the attacker’s fraudulent SIM swap.
  4. With full account takeover, the attacker can defeat MFA and access the victim’s personal accounts, including their employer’s systems.

“Lapsus$ took over online accounts via sign-in and account recovery workflows that sent one-time links or MFA passcodes via SMS or voice calls”

Insider Recruitment

Many organizations might not realize it, or want to believe it, but employees are vulnerable to monetary incentives to help provide system access to attackers. The Report notes that in some cases these incentives could be as high as $20,000 per week. Compromised employees might hand over access credentials, approve incoming MFA requests, conduct SIM swaps, and perform other actions to help the attackers gain access to the organization’s systems.

Supply chain attacks

Lapsus$ and similar groups do not just directly attack organizations, they also go after targets that provide access to many organizations – third-party service providers and business process outsourcers (BPOs). Evidence of this strategy is the recent wave of attacks on secure file transfer services, such as Accellion and the GoAnywhere service offered by Fortra. By gaining access to these services, the attackers have entrée to files uploaded by the services’ many customers.

Per the report:

In January 2022, a threat actor studied for this report gained access to privileged internal tools of a third-party service provider by compromising the computer of a customer support contractor from one of its BPOs. The real target of this attack was not the third-party service provider, nor the BPO, but rather the downstream customers of the service provider itself. This is a remarkable example of a creative three-stage supply chain attack used by this class of threat actors.

Recommendations

The Board outlines several recommendations, some of which are more likely to be within an organization’s power to act on than others. The recommendations fall into four main categories:

  • strengthening identity and access management (IAM);
  • mitigating telecommunications and reseller vulnerabilities;
  • building resiliency across multi-party systems with a focus on business process outsourcers (BPOs); and
  • addressing law enforcement challenges and juvenile cybercrime.

As noted above, one of the strongest suggestions for enhancing IAM is moving away from passwords. The Board encourages increased use of Fast IDentity Online 2 (FIDO2)-compliant, hardware-backed solutions. In short, FIDO authentication permits users to sign in with passkeys, usually via a biometric or security key. Of course, biometrics raise other compliance risks, but the Board observes that this technology avoids the vulnerabilities and suboptimal practices that have developed around passwords.

Another recommendation is to develop and test cyber incident response plans. As we have discussed on this blog several times (e.g., here and here), no system of safeguards is perfect. So, as an organization works to prevent an attack, it also must plan to respond should one be successful. Among other things, these plans should:

  • identify critical data, systems, and assets that should be prioritized during an attack,
  • outline a tested process for recovering from back-ups,
  • have an internal communications plan,
  • involve BPOs and third-party service providers in developing and practicing the plan,
  • identify and maintain contact information for internal and external individuals and groups that are critical to the response process – key employees, DFIR firms, law enforcement, outside counsel, insurance carriers, etc.

The Report is a great read for anyone involved in some way in addressing data risk to an organization. A critical take-away for anyone reading this report is threats are evolving and come in many forms. A control implemented in year 1 may become a significant vulnerability in year 2. Forty years later, the movie WarGames continues to be relevant, even if only to show that some of the most secure systems can be compromised by a handful of curious teenagers.


In a 2019 post about increasing cyber risks in K-12 schools, we cited a report, “The State of K-12 Cybersecurity: 2018 Year in Review,” that contained sobering information about cybersecurity in local school districts across the country. According to that report, in 2018, there were 122 publicly disclosed cybersecurity incidents affecting school districts across 38 states. Not much has changed. A more recent article looking at ransomware activity in 2023 reports 120 attacks against school districts thus far in the year.

Yesterday, the Biden administration announced “new actions and private commitments to bolster the nation’s cyber defense at schools.” Among the actions:

Secretary of Education Miguel Cardona and Secretary of Homeland Security Alejandro Mayorkas, joined First Lady Jill Biden, to convene school administrators, educators and private sector companies to discuss best practices and new resources available to strengthen our schools’ cybersecurity, protect American families and schools, and prevent cyberattacks from disrupting our classrooms.

Perhaps more impactful in the short term are references in the announcement to (i) additional funding sources for schools, and (ii) recently released guidance, “K-12 Digital Infrastructure Brief: Defensible & Resilient,” jointly published by the U.S. Department of Education and the Cybersecurity and Infrastructure Security Agency (CISA). In particular, the guidance outlines, among other things, some “High-Impact Recommendations,” such as implementing multifactor authentication. Potential sources for increased funding include an FCC-proposed pilot program that would provide $200 million over three years. School districts might also consider allocating funds that remain available from the Elementary and Secondary School Emergency Relief Fund (ESSER Fund) established during COVID-19. Also, AWS recently pledged $20 million to a grant program designed to support training and incident response at schools.

While it is true that school districts are often understaffed and underfunded, including in the area of cybersecurity, there is some low-hanging (and relatively inexpensive) fruit that schools might address more readily. One of those areas is incident response.

Even if a district is not in a position to take all the steps it might want to in order to prevent an attack, it might be able to vastly improve its plans to respond to one. Doing so could significantly reduce the disruption to students and related communities.

The White House report notes that during the prior academic year four schools had to cancel classes or close completely in response to an attack.

Below are a few basic elements that should be included in an incident response plan (IRP):

  • identifying security incidents;
  • responding to security incidents; and
  • mitigating harmful effects of security incidents.

Certainly, each of these elements might look different district to district considering size, number of locations, information systems, prior experience, cyber insurance policies, type of personal information, and state laws. But they are important elements for any plan.

More specifically, school boards will want to think about who will be doing the responding – who is on the “security incident response team.” This is a team that is organized and trained to effectively respond to security incidents. Areas to consider when forming and building a team include:

  • A strong balance of skill sets among team members (IT, legal, communications, etc.)
  • Ensure lines of communication will be available among team members during a crisis
  • Consider external parties that can provide specific expertise concerning incident response
  • Commit to regularly practicing incident response procedures for different types of attacks.

Among other things, the IRP should help direct the team’s mitigation efforts, which are facilitated through measures such as contingency planning, robust data backup, and recovery processes. These are areas the school board or superintendent should not be considering for the first time when a security incident occurs. For example, knowing that you have a backup of student data is not enough; regularly confirming that you can restore from backups while maintaining data integrity is key to minimizing disruption to the district.
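As one small, concrete example of that integrity check, a district’s IT staff could record a checksum when each backup is created and re-verify it during restore drills. This is a minimal sketch under stated assumptions, not a full backup-validation pipeline; the file names and sidecar convention are illustrative.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Stream the file through SHA-256 so large backups are not loaded into memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def record_checksum(backup: Path) -> None:
    """Write a sidecar .sha256 file next to the backup at creation time."""
    backup.with_suffix(backup.suffix + ".sha256").write_text(sha256_of(backup))

def verify_backup(backup: Path) -> bool:
    """During a restore drill, confirm the backup still matches its recorded checksum."""
    recorded = backup.with_suffix(backup.suffix + ".sha256").read_text().strip()
    return sha256_of(backup) == recorded
```

Run `verify_backup` as part of each scheduled restore drill; a mismatch signals corruption or tampering before the backup is ever needed in a real incident.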

There is a lot that can be said about steps toward preparedness and IRP development, but the point is that these are examples of measures that can be implemented more quickly and at less cost to help a district affected by a breach. Importantly, they can minimize the impact of the breach on the district and get kids back in the classroom more quickly.


Recently, things may have sped up a little in your doctor’s office. The notes for your recent visit may have been organized and filed a little more quickly. You might have received assistance sooner than expected with a physician letter to your carrier concerning a claim. You also may have received copies of those medical records you have been waiting for, earlier than usual. Greasing the skids in these areas could be due to the use of generative AI technologies, such as ChatGPT, being leveraged in a myriad of ways across all industries, not just healthcare. But, as pointed out in this recent ScienceBlog post, some caution may be in order.

Healthcare providers and business associates subject to the Privacy and Security Rules under the Health Insurance Portability and Accountability Act (HIPAA) have some steps to consider before sharing protected health information (PHI) with a third party. One that should come as no surprise is determining whether the third party is a business associate. According to OCR guidance:

A “business associate” is a person or entity that performs certain functions or activities that involve the use or disclosure of protected health information on behalf of, or provides services to, a covered entity… The Privacy Rule lists some of the functions or activities, as well as the particular services, that make a person or entity a business associate, if the activity or service involves the use or disclosure of protected health information. The types of functions or activities that may make a person or entity a business associate include payment or health care operations activities, as well as other functions or activities regulated by the Administrative Simplification Rules.

Business associate functions and activities include: claims processing or administration; data analysis, processing or administration; utilization review; quality assurance; billing; benefit management; practice management; and repricing.  Business associate services are: legal; actuarial; accounting; consulting; data aggregation; management; administrative; accreditation; and financial. See the definition of “business associate” at 45 CFR 160.103.

If the third party is a business associate, sharing PHI with the third party must be preceded by entering into a business associate agreement. As a best practice, it also would be prudent to assess the cybersecurity policies and procedures of the third party as well. If the third party is not a business associate, in general, the authorization of the patient would be needed in order to share the information, unless there is a basis under HIPAA to make the disclosure without an authorization.

With ChatGPT or a similar service (offered by a third party to any physician’s office or health system), any doctor, nurse, office manager, administrator, or other staff member in a physician’s office or health system could set up an account (free or paid) and begin leveraging the large language model-based chatbot’s services. Those services might include help with various administrative tasks, such as those described above. The same might be true for workers employed by business associates providing services to physician office customers, for example.

Naturally, the question is whether ChatGPT or any similar service is a business associate under HIPAA. It may be, and it may be prepared to enter into a business associate agreement. Whether the service is HIPAA compliant is an issue to assess when considering the use cases that will be permitted. These are important determinations that any HIPAA covered entity or business associate needs to make before permitting the sharing of PHI on the chatbot’s platform, or that of any similar service. Of course, an alternative is to limit the use of these tools by policy and provide training. Employees at all levels of a healthcare organization simply may not appreciate the flow of data, particularly considering the ease with which they could generate a quick patient letter or referral to a colleague.

For completeness, I asked ChatGPT: “I am a healthcare provider, should I share patient data on ChatGPT?” The chatbot responded:

As a healthcare provider, it is essential to prioritize patient privacy and adhere to data security regulations. Sharing patient data on this platform is not recommended, as it could compromise patient confidentiality. Instead, consider using secure and HIPAA-compliant communication channels for discussing sensitive patient information. If you have any medical or healthcare-related question, feel free to ask, and I’ll provide general information without requiring specific patient details.

The Department of Health and Human Services and the Federal Trade Commission have sent a joint letter to approximately 130 hospital systems and telehealth providers to emphasize the risks and concerns about the use of technologies, such as the Meta/Facebook pixel and Google Analytics, that can track a user’s online activities. We have summarized each agency’s position and guidance here and here, respectively.

The joint letter emphasizes the agencies’ focus on this issue, conveying to recipients a call to action:

To the extent you are using the tracking technologies described in this letter on your website or app, we strongly encourage you to review the laws cited in this letter and take actions to protect the privacy and security of individuals’ health information.

The joint letter reminds entities regulated by HIPAA that the HIPAA privacy and security rules apply when the information that such entities collect through tracking technologies or disclose to third parties (e.g., tracking technology vendors) includes PHI. As noted in the earlier HHS Bulletin on this issue, individuals with or without an existing patient relationship with the regulated entity could be sharing PHI with the entity (or a third party) through its website tracking technologies.

For those entities not subject to HIPAA, the joint letter affirms they still may have an obligation to protect against impermissible disclosures of personal health information under the FTC Act and the FTC Health Breach Notification Rule. The joint letter makes clear this would be the case:

 “even if you relied upon a third party to develop your website or mobile app and even if you do not use the information obtained through use of a tracking technology for any marketing purposes”

Regulated entities, including covered entities and business associates, should conduct an audit of any tracking technologies used on their websites, web applications, or mobile apps and determine whether those technologies are being used in a manner that complies with HIPAA and the FTC Act. They also should consider an ongoing process for managing their online assets to ensure such technologies are not implemented without appropriate vetting and risk assessment. Regulated entities also should review tracking technology vendor agreements to ensure they contain appropriate terms relating to the collection, use, processing, and disclosure of PHI or personal health information. This may require a business associate agreement. Additional considerations are provided in our earlier posts at the links above.
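A first pass at such an audit can be as simple as scanning a page’s source for known tracker signatures. The sketch below is illustrative only: the signature list is a small, assumed sample, and a thorough audit should also inspect live network traffic, tag managers, and mobile app SDKs.

```python
import re

# Illustrative signatures only; a real audit should cover many more vendors
# and inspect network requests, not just page source.
TRACKER_SIGNATURES = {
    "Meta/Facebook pixel": re.compile(r"connect\.facebook\.net|fbq\("),
    "Google Analytics": re.compile(r"googletagmanager\.com|google-analytics\.com|gtag\("),
}

def find_trackers(html: str) -> list:
    """Return the names of known tracking technologies referenced in a page's HTML."""
    return [name for name, sig in TRACKER_SIGNATURES.items() if sig.search(html)]
```

Any hit on a patient-facing page (appointment scheduling, a symptom checker, a patient portal login) is a flag for the HIPAA and FTC Act analysis described above.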

On July 18, 2023, Oregon’s Governor signed Senate Bill 619 which enacts Oregon’s comprehensive consumer data privacy statute. Oregon joins California, Colorado, Connecticut, Indiana, Iowa, Montana, Tennessee, Texas, Utah, and Virginia in enacting a comprehensive consumer privacy law. Most of the sections of the law are scheduled to take effect on July 1, 2024, with a delayed effective date of July 1, 2025, for non-profit organizations.

When does the law apply?

The statute applies to any person that conducts business in the State of Oregon, or that provides products or services to residents of the state, and that during a calendar year controls or processes:

  • The personal data of 100,000 or more consumers, other than personal data controlled or processed solely for the purpose of completing a payment transaction; or,
  • The personal data of 25,000 or more consumers, while deriving 25 percent or more of the person’s annual gross revenue from selling personal data.
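The two applicability thresholds above can be expressed as a simple decision rule. The sketch below is our illustration of the summary, not statutory text; the function name and parameters are assumptions, and real applicability analysis should be done with counsel.

```python
def oregon_law_applies(consumers_processed: int,
                       payment_only_consumers: int,
                       sale_revenue_share: float) -> bool:
    """Sketch of SB 619's applicability thresholds as summarized above.

    consumers_processed: consumers whose personal data is controlled or processed
    payment_only_consumers: those processed solely to complete a payment transaction
    sale_revenue_share: fraction of annual gross revenue from selling personal data
    """
    # First prong: 100,000+ consumers, excluding payment-only processing
    if consumers_processed - payment_only_consumers >= 100_000:
        return True
    # Second prong: 25,000+ consumers plus 25%+ of revenue from selling personal data
    return consumers_processed >= 25_000 and sale_revenue_share >= 0.25
```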

The following are some of the types of businesses that are exempted from the statute:  

  • A public corporation
  • Covered entities or business associates processing protected health information under the Health Insurance Portability and Accountability Act (HIPAA)
  • Organizations subject to the Gramm-Leach-Bliley Act.

Who is protected by the law?

The law protects consumers, defined as natural persons who reside in the State of Oregon and act in any capacity other than a commercial or employment context.

What data is protected by the law?

Personal data that is protected under the statute is defined as “data, derived data or any unique identifier that is linked to or is reasonably linkable to a consumer or to a device that identifies, is linked to or is reasonably linkable to one or more consumers in a household.”

It does not include:

  • Deidentified data
  • Data that is lawfully available through federal, state, or local government records or through widely distributed media
  • Data the controller reasonably understood to have been lawfully made available to the public by the consumer.

The statute also includes biometric data under personal data. Under the legislation biometric data is defined as personal data generated by automatic measurements of a consumer’s biological characteristics, such as the consumer’s fingerprint, voice print, iris pattern, gait, or other unique biological characteristics that allow or confirm the unique identification of a consumer.

What are the rights of consumers?

Under the new legislation, consumers have the right to:

  • Confirm whether a controller is processing the consumer’s personal data and to access the personal data;
  • Correct inaccuracies in the consumer’s personal data;
  • Delete personal data provided by or obtained about the consumer;
  • Obtain a digital copy of the data the consumer previously provided, if available;
  • Opt out of the processing of personal data for purposes of targeted advertising, the sale of personal data, or profiling in furtherance of a decision that produces a legal or similarly significant effect concerning the consumer; and
  • Obtain a list of “specific third parties” to whom a controller discloses personal data.

What obligations do businesses have?

The legislation requires that businesses post a privacy policy that describes the categories of personal information it collects, the purpose of the collection, the categories of third parties with whom the personal information is shared, and an explanation of the consumer’s rights.

Covered businesses must also include a “clear and conspicuous” description of any processing done for the purpose of targeted advertising.

Eventually, covered businesses will be required to recognize universal opt-out mechanisms, though that portion of the statute does not take effect until January 1, 2026.

How is the law enforced?

The State Attorney General has exclusive authority to enforce the statute; it does not provide a private right of action.

If you have questions about Oregon’s privacy law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

On June 7, 2023, Connecticut’s Governor signed Senate Bill (SB) 1103, which enacted regulations regarding artificial intelligence, automated decision-making, and personal data privacy. The law sets several requirements for state agencies’ development and use of automated systems for critical decisions, including the designation of an artificial intelligence officer under the Office of Policy and Management, who would be tasked with developing and adopting automated systems procedures for state agencies’ use.

Under the bill, state contracting agencies will be prohibited from entering into a contract on or after July 1, 2023, unless the contract contains a provision requiring the business to comply with all applicable provisions of the state’s consumer data privacy law.

The bill expands exemptions from the consumer data privacy law to include any air carrier that is regulated under the Federal Aviation Act (FAA) and the Airline Deregulation Act.

Most of the bill’s provisions took effect on July 1, 2023.

If you have questions about the changes in Connecticut law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

On June 26, 2023, the Governor of Connecticut signed Senate Bill (SB) 3 which set forth new requirements related to consumer health data and protections for minors online.

As Connecticut’s comprehensive consumer privacy law took effect on July 1, 2023, the state has expanded privacy requirements under SB 3. Similar to Washington and Nevada, Connecticut sets standards for accessing and sharing consumer health data by private entities. The health data portions of the legislation took effect July 1, 2023.

Health Data Defined

Under the new legislation, consumer health data is defined as personal data that a controller uses to identify a consumer’s physical or mental health condition or diagnosis and includes but is not limited to gender-affirming health data and reproductive or sexual health data.

Certain types of information are excluded from coverage including protected health information under the Health Insurance Portability and Accountability Act (HIPAA).

Protections for Health Data

Covered entities are prohibited from collecting or sharing health data without a consumer’s consent. And health data may not be sold without a consumer completing a specified consent form.

The law also prohibits anyone from implementing a geofence to identify, track, collect data from, or send notifications or messages to a consumer that enters the virtual perimeter around a healthcare provider or facility.

Protections for Minors

Under the legislation, certain social media platforms are prohibited from establishing accounts for a minor under the age of 16 without a parent’s or guardian’s consent.

Moreover, covered platforms must delete a minor’s social media account and cease processing personal data within 10 days of receiving a request.

Some of the protections pertaining to minors do not take effect until October 1, 2024.

Enforcement

Under the legislation, any violation of either the consumer health data or online service provisions are enforced solely by the state attorney general. There is no private right of action created.

If you have questions about the changes to Connecticut’s Privacy Law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

On June 7, 2023, Montana’s Governor signed Senate Bill (SB) 351 which revises the state’s privacy law pertaining to genetic information.

This legislation takes effect on October 1, 2023.

Covered Entity

A business covered by the legislation is any partnership, corporation, association, or public or private organization that:

  • Offers consumer genetic testing products or services directly to a consumer; or
  • Collects, uses, or analyzes genetic data

Consumer Defined

Under the legislation, a consumer is defined as an individual who is a resident of the state of Montana.

Covered Entity Obligations

Under the legislation, covered entities have the following obligations:

  • Provide clear and complete information regarding the business’s policies and procedures for the collection, use, or disclosure of genetic data.
  • Obtain a consumer’s initial express consent for the collection, use, or disclosure of the consumer’s genetic data.
  • Obtain a consumer’s separate express consent for, among other things, the transfer or disclosure of the consumer’s genetic data to any person other than the company’s vendors and service providers.
  • Develop, implement, and maintain a comprehensive security program to protect a consumer’s genetic data against unauthorized access, use, or disclosure.

The law does not apply to the following:

  • Protected health information collected by a covered entity or business associate as defined under federal privacy requirements, if separate informed consent relating to its collection, use, and dissemination is obtained from the consumer.
  • An entity when engaged in collecting, using, or analyzing genetic data or biological samples in the context of research pursuant to certain federal definitions.

Enforcement

The legislation is solely enforced by the state attorney general, who can initiate a civil enforcement action.

If you have questions about Montana’s new genetic data law or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.