On April 17th, the U.S. Supreme Court dismissed the highly anticipated U.S. v. Microsoft, ruling that recently enacted legislation rendered the case moot. Microsoft Corp. had been in litigation with the U.S. Department of Justice (DOJ) for several years over whether Microsoft must comply with a U.S. search warrant for a customer’s emails and other personal data within its “possession, custody or control”, regardless of whether such data is stored within the U.S. or abroad. The Supreme Court’s ruling had been anticipated since March, when President Trump signed into law the Clarifying Lawful Overseas Use of Data Act (CLOUD Act), H.R. 4943, which amends a provision of the Electronic Communications Privacy Act of 1986 (ECPA) to clarify the federal government’s authority to access U.S. individuals’ data and communications stored abroad.

The dispute between Microsoft and the DOJ arose in 2013, when prosecutors served Microsoft with a warrant issued under the Stored Communications Act of 1986 (SCA), a provision of the ECPA, demanding that the company turn over personal emails and data of a user account associated with a criminal drug trafficking investigation. Microsoft complied with the warrant to the extent the data was stored on servers in the U.S. However, a portion of the requested data was stored on a server in Ireland, and Microsoft refused to turn it over.

The Supreme Court agreed to hear the dispute in October 2017, after the U.S. Court of Appeals for the Second Circuit, in July 2016, held in favor of Microsoft and quashed the warrant, a decision the DOJ appealed. In oral arguments before the Supreme Court in February, the DOJ and federal law enforcement argued that technology companies are disrupting criminal investigations by refusing to turn over cloud data stored on servers abroad. It should not matter where data is stored if it can be accessed “domestically with a click of a computer mouse”, the DOJ argued. Conversely, Microsoft argued that the SCA, the basis for the DOJ’s warrant, was not equipped to address new technologies and usage.

The CLOUD Act, enacted on March 22nd, clarifies the federal government’s authority to compel production of data stored abroad and creates new procedures for issuing such warrants. The new legislation also affords a company the opportunity to move to quash a warrant on the basis that there is a “material risk” that the demand would violate foreign law.

Following passage of the CLOUD Act, the DOJ filed a motion to dismiss the case on grounds that the new legislation rendered the dispute moot, and stated that it would withdraw the original warrant and obtain a new one under the procedural requirements of the CLOUD Act, to which Microsoft, in a subsequent motion, agreed. “There is no reason for this court to resolve a legal issue that is now of only historical interest,” Microsoft stated in its motion.

The CLOUD Act has been broadly supported by both law enforcement and the technology sector, which agree that the 30-year-old SCA was in need of significant updates. The full implications of the new legislation will take time to become evident.

Last week, the Department of Health and Human Services’ Office for Civil Rights (OCR) provided guidance for HIPAA covered entities and business associates that use or want to use cloud computing services involving protected health information (PHI). Covered entities and business associates seeking cloud services often have many concerns regarding HIPAA compliance, and this guidance helps to address some of those concerns. The guidance also will help cloud service providers (CSPs) understand some of their obligations when serving the vast health care sector. Frankly, this guidance is helpful for any entity that desires to use cloud services to store, transfer or otherwise process sensitive information, including personal information. We summarize some of the key points from the guidance below.

CSPs that only store PHI and provide “no-view” services are not subject to HIPAA, right?

Wrong. OCR reminds everyone that when a covered entity engages a CSP to create, receive, maintain, store, or transmit ePHI on its behalf, the CSP is a business associate under HIPAA. Likewise, when a business associate subcontracts with a CSP for similar services, the CSP is a business associate.

Practically, however, with regard to no-view services, CSPs and their HIPAA-covered customers can take advantage of the flexibility and scalability built into the HIPAA rules. OCR’s guidance points out that when a CSP is providing only no-view services, certain Security Rule requirements may be satisfied for both parties through the actions of one of the parties. For example, certain access controls, such as unique user identification, may be the responsibility of the customer (when the customer has sole access to ePHI), while others, such as encryption, may be the responsibility of the CSP.  Thus, the parties will have to review these issues carefully and modify the agreements accordingly.

Is this true even if the CSP processes or stores only encrypted ePHI and lacks an encryption key for the data?

Yes. Accordingly, the covered entity (or business associate) and the CSP must enter into a HIPAA-compliant business associate agreement (BAA), and the CSP is both contractually liable under the BAA and directly liable for compliance with the applicable requirements of the HIPAA Rules. Note that the absence of a BAA does not change that the CSP is a business associate subject to the applicable requirements under the rules, but the HIPAA covered entity would not have contractual protection, such as breach of contract claims and indemnity.
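To make the “no-view” model concrete, below is a minimal sketch, assuming Python’s cryptography package and purely hypothetical upload/download functions standing in for whatever interface a given storage provider actually exposes. The customer encrypts ePHI with a key it alone holds, so the CSP stores only ciphertext it cannot read; under the OCR guidance, the CSP is nonetheless a business associate.

```python
# Minimal sketch of a "no-view" arrangement: the customer holds the key and the
# cloud provider stores only ciphertext it cannot decrypt.
# Assumptions: the `cryptography` package is installed; `upload`/`download` and
# STORE are hypothetical stand-ins for a provider's storage API.
from cryptography.fernet import Fernet

STORE: dict[str, bytes] = {}  # stands in for the provider's object store


def upload(name: str, blob: bytes) -> None:
    """Hypothetical provider storage call."""
    STORE[name] = blob


def download(name: str) -> bytes:
    """Hypothetical provider retrieval call."""
    return STORE[name]


# The covered entity (or business associate) generates and keeps the key;
# it is never shared with the cloud provider.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

record = b"Patient: Jane Doe; visit notes ..."  # illustrative ePHI
upload("record-001", cipher.encrypt(record))    # provider sees only ciphertext

# Only the key holder can recover the plaintext.
assert cipher.decrypt(download("record-001")) == record
```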

Entities not covered by HIPAA may have other legal obligations that apply when they decide to share certain information with a CSP. For example, rules in California and Massachusetts generally require businesses to obtain written agreements from the third parties that maintain personal information on their behalf, obligating those parties to safeguard the information while performing the desired services.

So, if we use a CSP, we only have to worry about having a BAA in place?

Probably not. Use of cloud services likely will require the covered entity or business associate to perform a risk assessment to understand how those services will affect overall HIPAA compliance. Some of those compliance issues will be addressed in the BAA. However, contracting with a CSP often involves a “Service Level Agreement” or “SLA,” which can raise other HIPAA compliance issues. For example, specific SLA provisions concerning system availability or back-up and data recovery may not be permissible under HIPAA. Entities not covered by HIPAA similarly need to ensure that the cloud services will meet their needs with respect to these and other issues, such as return of data following termination of the SLA.

If data is encrypted in the cloud, is HIPAA satisfied?

No. Strong encryption certainly reduces risks to ePHI, but it does not by itself maintain the data’s integrity and availability. That is, for example, encryption does not ensure that ePHI is not corrupted by malware, or that it will remain available to authorized persons during emergency situations. Further, encryption does not address other administrative and physical safeguards. For example, even when the parties have agreed that the customer is responsible for authenticating access to ePHI, the CSP may still need to implement appropriate internal controls to ensure only authorized access to administrative tools that manage resources (e.g., storage, memory, network interfaces, CPUs). The SLA and the BAA are important vehicles for confirming which entity is responsible for these requirements.
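As a rough illustration of the integrity point, the sketch below (standard-library Python; all names are illustrative and not drawn from the OCR guidance) pairs stored objects with a digest manifest so that corruption of ePHI, by malware or otherwise, is detected on retrieval rather than silently returned. Availability, by contrast, typically calls for separate controls such as backups and contingency planning.

```python
# Sketch of an integrity control that complements encryption: record a SHA-256
# digest when an object is stored and verify it on retrieval, so corruption is
# detected instead of silently read back. Names here are illustrative only.
import hashlib


def digest(blob: bytes) -> str:
    return hashlib.sha256(blob).hexdigest()


manifest: dict[str, str] = {}   # object name -> expected digest
storage: dict[str, bytes] = {}  # stands in for the cloud object store


def store(name: str, blob: bytes) -> None:
    manifest[name] = digest(blob)
    storage[name] = blob  # in practice this would be the encrypted object


def retrieve(name: str) -> bytes:
    blob = storage[name]
    if digest(blob) != manifest[name]:
        raise ValueError(f"integrity check failed for {name}")
    return blob


store("record-001", b"encrypted ePHI bytes ...")
storage["record-001"] = b"corrupted bytes"  # simulate tampering or bit rot
try:
    retrieve("record-001")
except ValueError as err:
    print(err)  # corruption is detected, not silently returned
```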

Can CSPs block our access to PHI?

No. Blocking a covered entity’s access to PHI would violate the Privacy Rule. Thus, for example, an SLA cannot contain a provision that allows the CSP to block access to ePHI to resolve a payment dispute. Note this may not be the case with arrangements not covered by HIPAA. Accordingly, owners of the data in these situations need to proceed with care when negotiating, and when disputing payment under, such SLAs.

Do CSPs have to report “pings” and other unsuccessful security incidents?

In general, the answer is yes. Security Rule § 164.314(a)(2)(i)(C) provides that a BAA must require the business associate to report any security incidents of which it becomes aware. A security incident means the attempted or successful unauthorized access, use, disclosure, modification, or destruction of information or interference with system operations in an information system.  However, the Security Rule is flexible and does not prescribe the level of detail, frequency, or format of reports of security incidents, which may be worked out in the BAA.  Thus, the parties should consider different levels of detail, frequency, and formatting of reports based on the nature of the security incidents.

Does HIPAA permit PHI to be stored in the cloud outside of the United States?

In short, the answer is yes. But, as noted above, the covered entity or business associate needs to consider the applicable risks.

Cloud services can yield substantial cost savings and offer substantial convenience to users. CSPs also tend to offer a higher level of sophistication in the area of data security than most health care providers and their service providers. But the failure to think carefully about adoption and implementation of these services can create substantial exposure for the company. Significant exposure can result not only from a breach of PHI in the cloud environment, but also from the failure to appropriately consider and document the risks relating to that environment.

Whether through Google Docs, Dropbox, or some other file-sharing system, employees, especially millennials and other digital natives, are increasingly likely to set up personal cloud-based document sharing and storage accounts for work purposes, usually with well-meaning intentions, such as convenience and flexibility. Sometimes this is done with explicit company approval, sometimes with the tacit awareness of middle management, and often the employer is unaware of the activity.

When an employee quits or is terminated, however, that account, and the business documents it contains, may be locked away in an inaccessible bubble. Worse, the employee could access trade secrets and other information stored in the cloud to unfairly compete. For example, in 2012, the computer gaming company Zynga sued a former employee for uploading trade secrets onto the employee’s personal Dropbox account before leaving to work for a competitor. At a minimum, it may take time to recover the information or obtain the user name and password from the former employee. Storage of proprietary information, especially personally identifiable information (PII), on personal cloud accounts also increases the risk of a company data breach if the information is hacked. Finally, allowing business documents to be stored outside of the company’s systems can also create headaches when enacting a litigation hold or responding to electronic discovery requests in litigation. What should employers be doing now to address this trend?

Interested in reading more? Please see the full post on the Non-Compete and Trade Secrets Report.

Government contractors have a wide range of unique challenges (find out more about these here), not the least of which is data security. A good example is the interim rule the Department of Defense (DoD) issued last month that implements sections of the National Defense Authorization Act for Fiscal Years 2013 and 2015. In short, these provisions expand the incident reporting requirements for contractors and increase the security requirements for cloud service providers.

The Secretary of Defense determined that “urgent and compelling” reasons exist to issue the interim rule without prior opportunity for public comment. There is an urgent need to protect covered defense information and to increase awareness of the full scope of cyber incidents being committed against defense contractors. The use of cloud computing has greatly increased, according to the Secretary, and has increased the vulnerability of DoD information. The recent high-profile breaches of Federal information also influenced this determination. It is easy to see how similar considerations will influence other federal and state agencies to tighten their data security requirements on their contractors and subcontractors.

The hope here is that the rule will increase the cyber security of DoD information on contractor systems, help to mitigate risk, and gather information for the development of future improvements in cyber security. Note that the DoD will consider public comments to the interim rule before issuing the final rule. Comments must be submitted on or before October 26, 2015 to be considered.

Incident Reporting Highlights

  • Contractors and subcontractors must report cyber incidents that result in an actual or potentially adverse effect on a covered contractor information system or covered defense information residing on that system, or on a contractor’s ability to provide operationally critical support.
  • A “cyber incident” means actions taken through the use of computer networks that result in a compromise or an actual or potentially adverse effect on an information system and/or the information residing therein. A “compromise” is the disclosure of information to unauthorized persons, or a violation of the security policy of a system, in which unauthorized intentional or unintentional disclosure, modification, destruction, or loss of an object, or the copying of information to unauthorized media may have occurred.
  • Rapid reporting is required – this means within 72 hours of discovery of a cyber incident.
  • The DoD recognizes that the reporting may include the contractor’s proprietary information, and will protect against the unauthorized use or release of that information.
  • The reporting of a cyber incident will not, by itself, be interpreted as evidence that the contractor or subcontractor has failed to adequately safeguard covered defense information.

Cloud Computing Highlights

  • Contracts for cloud computing services may be awarded only to providers that have been granted provisional authorization by the Defense Information Systems Agency, at the appropriate level.
  • Cloud computing service providers must maintain government data within the 50 states, the District of Columbia, or outlying areas of the United States, unless physically located on DoD premises. Government data can be maintained outside the U.S. upon written notification from the contracting officer.
  • Government data means any information, document, media, or machine-readable material, regardless of physical form or characteristics, that is created or obtained by the government in the course of official government business.
  • Purchase requests for cloud computing service must, among other things, describe government data and the requirement for the contractor to coordinate with the responsible government official to respond to any “spillage” occurring in connection with the services. Spillage happens when a security incident results in the transfer of classified or controlled unclassified information onto an information system not authorized for the appropriate security level.

Defense contractors and their subcontractors will need to review the interim rule carefully and make adjustments. Of course, the focus here is not solely on personally identifiable information, but the same principles apply. Maintaining a well-thought-out and practiced incident response plan is critical.

Kentucky Gov. Steve Beshear signed H.B. 232 on April 10, 2014, making the Commonwealth the 47th state to enact a data breach notification law. The law also limits how cloud service providers can use student data. A breach notification law in New Mexico may follow shortly.

Data Breach Notification Mandate

The Kentucky law follows the same general structure as many of the breach notification laws in other states:

  • A breach of the security of the system happens when there is unauthorized acquisition of unencrypted and unredacted computerized data that compromises the security, confidentiality, or integrity of personally identifiable information maintained by the information holder as part of a database regarding multiple individuals that actually causes, or leads the information holder to reasonably believe has caused or will cause, identity theft or fraud against any resident of Kentucky. The law refers only to “acquisition,” not “access,” and appears to have a risk-of-harm trigger.
  • The good faith acquisition of personally identifiable information by an employee or agent of the information holder for the purposes of the information holder is not a breach if the personally identifiable information is not used or subject to further unauthorized disclosure.
  • “Personally identifiable information” means an individual’s first name or first initial and last name in combination with the individual’s (i) Social Security number; (ii) driver’s license number; or (iii) account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.
  • The notification required under the law must be made in the most expedient time possible and without unreasonable delay, consistent with the legitimate needs of law enforcement or any measures necessary to determine the scope of the breach and restore the reasonable integrity of the data system.
  • Notice may be provided in writing and can be provided electronically if the E-Sign Act requirements are met. For larger breaches, the law also contains substitute notice provisions similar to those in other states.
  • If notification is required to more than 1,000 Kentuckians at one time under this law, all nationwide consumer reporting agencies and credit bureaus also must be notified of the timing, distribution and content of the notices. However, the law does not require the Kentucky Attorney General to be notified of the incident, as is the case in a number of other states such as California, Maryland, Massachusetts, New Hampshire, and New York.
  • The law excludes persons and entities that are subject to Title V of the Gramm-Leach-Bliley Act of 1999 and the Health Insurance Portability and Accountability Act of 1996 (HIPAA). Of course, covered entities, business associates and certain vendors have their own breach notification requirements.

Protections for Student Data in the Cloud

The law is designed to protect student data stored in the “cloud” by educational institutions, public or private, including any administrative units, that serve students in kindergarten through grade twelve. We may see more of these kinds of laws, particularly in light of the Fordham Law School study on the topic. For purposes of this law, “student data” means

any information or material, in any medium or format, that concerns a student and is created or provided by the student in the course of the student’s use of cloud computing services, or by an agent or employee of the educational institution in connection with the cloud computing services. Student data includes the student’s name, email address, email messages, postal address, phone number, and any documents, photos, or unique identifiers relating to the student.

Cloud providers serving these institutions in Kentucky need to be aware of this law not only so they can take steps to comply, but because it requires the providers to certify in their services contracts with the educational institutions that the providers will comply with this new law.

Specifically, the law prohibits cloud computing service providers from “processing student data for any purpose other than providing, improving, developing, or maintaining the integrity of its cloud computing services, unless the provider receives express permission from the student’s parent.” Processing is defined quite broadly; it means to “use, access, collect, manipulate, scan, modify, analyze, transform, disclose, store, transmit, aggregate, or dispose of student data.”

While the provider may assist an educational institution with certain research permitted under the Family Educational Rights and Privacy Act of 1974, also known as “FERPA,” it may not use the data to “advertise or facilitate advertising or to create or correct an individual or household profile for any advertisement purpose.” Finally, the provider may not sell, disclose, or otherwise process student data for any commercial purpose.

On December 13, 2013, Fordham Law School’s Center on Law and Information Policy published a study (Study) that paints a sobering picture of how many public schools across the country handle student data, particularly with respect to data they store and services they (and students) use in the “cloud.” There is little doubt that many school districts are strapped for cash and, indeed, utilizing cloud services provides a new opportunity for significant cost savings. However, according to the Study, some basic, low-cost safeguards to protect the data of the children attending these public schools are not in place.

For example, some of the Study’s key findings include:

  • 95% of districts rely on cloud services for a diverse range of functions including data mining related to student performance, support for classroom activities, student guidance, data hosting, as well as special services such as cafeteria payments and transportation planning,
  • only 25% of districts inform parents of their use of cloud services,
  • 20% of districts fail to have policies governing the use of online services, and
  • with respect to contracts negotiated by districts with cloud service providers
    • they generally do not provide for data security and allow vendors to retain student information in perpetuity,
    • fewer than 25% specify the purpose for disclosures of student information,
    • fewer than 7% restrict the sale or marketing of student information, and
    • many districts have significant gaps in their contract documentation.

A data breach can be significant for any organization, and school districts are not immune. Parents are also beginning to pressure districts for more action, particularly as children can be an attractive target for identity theft.

The Fordham Study provides a number of helpful recommendations for public school districts. Indeed, based on the Study and consistent with basic data privacy and security principles (not to mention FERPA and other laws concerning the safeguarding of student data), there seems to be quite a bit of low-hanging fruit school districts can use to address the risks identified. These include, for example, establishing basic, written privacy policies and procedures that apply to cloud and similar services, implementing more thorough vetting of vendors handling sensitive personal information, and adopting and implementing for consistent use a set of strong privacy and security contract clauses when negotiating with all vendors that will access personal and other confidential information.

If your cloud service provider sounds like your local weather reporter – partly cloudy with a chance of rain – you may be in for a data security storm. A USA Today guest essay by Rajiv Gupta highlights the need for a multi-layered approach for cloud providers to ensure data stored in the cloud is secure, something we’ve touched upon here before. Businesses need greater certainty concerning the security of their data in the cloud and should be pressing their cloud providers for a security forecast with more certainty than their local weather report.

As Mr. Gupta notes, “by 2020 nearly 40% of the information in the digital universe will be touched by cloud computing providers.” Many businesses recognize this trend and may already have business data and applications in the cloud. However, some may not realize that some of their data is in the cloud without their knowledge or authorization, and without having had an opportunity to vet the provider(s). For example, it has been found that as many as 1 in 5 employees use commercial cloud providers to store company information.

Mr. Gupta discusses a number of tactics cloud providers should employ to secure data in the cloud – encryption, contextual access control, data loss prevention technologies, audit trails, and enforcement of security policies from application to application. Good advice for cloud providers. But customers of the cloud need to think a little differently.

Purchasers of cloud data storage services need to have a sense of the multiple layers of security tactics that are recommended for cloud providers and see to it that their provider(s) have them in place. But they also need to be thinking about:

  • What protections does their company have if the cloud provider’s systems are breached?
  • Does the services agreement with the cloud provider adequately address security, data breach, indemnity, reporting and so on?
  • What policies do they have for their employees concerning the privacy, security, integrity and accessibility of company data when using the cloud? And, which cloud should they be using?
  • How would employees’ use of their personal commercial cloud services complicate a company’s litigation hold processes?
  • Who at the company and at the cloud provider has/should have access to the data?
  • Is the cloud service provider a business associate/subcontractor under HIPAA, prepared to comply with the HITECH Act? What about the agreement requirements under state law?
  • Where is the data stored? Is it in the United States, or in a foreign country subject to different data security standards?
  • What if the cloud provider goes down or goes out of business? Will company data and applications be accessible?
  • Are the businesses’ customers and clients on board with use of the cloud for their data?

These are just some of the key questions businesses should be asking concerning use of the cloud. The technology can indeed yield substantial cost savings, but the failure to think carefully about its adoption and implementation can create substantial exposure for the company.

Under the HITECH Act, business associates are subject to the HIPAA privacy and security rules (the "HIPAA Rules") virtually to the same extent as covered entities. In addition to implementing this change for business associates ("BAs"), and providing additional guidance concerning what entities are business associates, the final HIPAA regulations issued last week also treat certain subcontractors of BAs as BAs directly subject to the HIPAA Rules. As a result of some of these changes, covered entities and BAs need to re-examine the relationships with their subcontractors to ensure they obtain the appropriate satisfactory assurances concerning the "protected health information" (PHI) they make available to those subcontractors.

Below are some of the key points from the final regulations concerning BAs and subcontractors:

  • Subcontractors. The final HIPAA regulations provide that subcontractors that create, receive, maintain, or transmit PHI on behalf of a BA are business associates. This is a significant expansion of the application of the HIPAA Rules; it makes subcontractors directly liable under the HIPAA Rules.

As a result of this change, just as covered entities need to ensure that they obtain satisfactory assurances concerning compliance with the HIPAA Rules (usually in the form of a business associate agreement, BAA) from their BAs, BAs must do the same with regard to certain subcontractors. This must continue no matter how far “down the chain” the PHI flows.

  • Business Associate Agreement Not Necessary to Establish Status as Business Associate. The final HIPAA regulations confirm that persons and entities that meet the definition of a BA have that status regardless of whether a "business associate agreement" is in place.
  • Data Storage Companies. Entities that maintain PHI (digital or hard copy) on behalf of a covered entity are BAs, "even if [they] do not actually view the [PHI]." This provision may create significant compliance issues for cloud service providers, as well as hard copy document storage companies, that have access to the records of their clients but may never look at them. The conduit exception is a narrow one and applies only to transmissions of data, not storage.
  • Certain Groups Not Considered Business Associates.
    • Researchers generally are not considered BAs when performing research functions.
    • Banking institutions generally are not considered BAs with respect to certain payment processing activities (e.g., cashing a check or conducting a funds transfer).
    • Malpractice insurers generally are not considered BAs when providing services related to the insurance, but may be BAs when providing risk management and similar services to covered entities.

Transition rule for compliance. A transition rule under the final HIPAA regulations permits covered entities and BAs to continue to operate under certain existing contracts for up to one year beyond the compliance date (September 23, 2013) of the final regulations. A qualifying business associate agreement will be deemed compliant until the earlier of (i) the date such agreement is renewed or modified on or after September 23, 2013, or (ii) September 22, 2014. This rule applies only to the language in the agreements; the parties must operate as required under the HIPAA Rules in accordance with the applicable compliance dates.

Covered entities and business associates may want to act more quickly to identify and contract with those individuals and entities from whom they must obtain satisfactory assurances under HIPAA.

As more companies move to the cloud, regulatory compliance remains a critical issue. For cloud service providers to the healthcare industry, it looks like the requirement to comply with the HIPAA privacy and security rules as business associates will be confirmed when long-awaited final regulations are issued, based on a report by Marianne Kolbasuk McGee with Healthcare Information Security. According to Ms. McGee’s report, Joy Pritts, chief privacy officer in the Office of the National Coordinator for Health IT, a unit of the Department of Health and Human Services, addressed this issue during a Jan. 7 panel discussion on cloud computing hosted by Patient Privacy Rights.

Cloud service providers would prefer to take the position that they are conduits of protected health information, and therefore not business associates, similar to the U.S. Postal Service, certain private couriers, and their electronic equivalents. See HIPAA FAQ. A conduit transports information but does not access it other than on a random or infrequent basis as necessary for the performance of the transportation service or as required by law. However, HHS has already noted that "a software company that hosts the software containing patient information on its own server or accesses patient information when troubleshooting the software function, is a business associate of a covered entity." See HIPAA FAQ.

According to Ms. Pritts’ remarks in the report cited above, it appears that the modifications made to HIPAA under the Health Information Technology for Economic and Clinical Health Act (the HITECH Act), along with anticipated regulatory guidance, will remove any doubt that cloud service providers servicing HIPAA covered entities are "business associates." This would require, among other things, that covered entities enter into business associate agreements with their cloud providers; standard confidentiality clauses likely will be insufficient. Of course, covered entities, practitioners and others are looking forward to these long-awaited regulations to help clarify this and other issues.

Last month, we briefly discussed "cloud computing," along with some issues that should be considered when deciding whether to adopt this new technology. Our post focused on data privacy and security issues.

As reported by Kim Hart, of The Hill’s Technology Blog, a December 9, 2009, Federal Communications Commission filing states that the Federal Trade Commission is in the process of investigating "cloud computing" to address some of the same concerns noted in the post referenced above – privacy and security concerns.

Companies operating in the cloud, or thinking of moving in that direction, ought to be on the lookout for regulation or guidance that could come from the FTC’s investigation.