As more companies move to the cloud, regulatory compliance remains a critical issue. For cloud service providers serving the healthcare industry, it appears that the requirement to comply with the HIPAA privacy and security rules as business associates will be confirmed when the long-awaited final regulations are issued, according to a report by Marianne Kolbasuk McGee of Healthcare Information Security. According to Ms. McGee’s report, Joy Pritts, chief privacy officer in the Office of the National Coordinator for Health IT, a unit of the Department of Health and Human Services, addressed this issue during a Jan. 7 panel discussion on cloud computing hosted by Patient Privacy Rights.

Cloud service providers would prefer to take the position that they are conduits for protected health information, and therefore not business associates, similar to the U.S. Postal Service, certain private couriers, and their electronic equivalents. See HIPAA FAQ. A conduit transports information but does not access it other than on a random or infrequent basis as necessary for the performance of the transportation service or as required by law. However, HHS has already noted that "a software company that hosts the software containing patient information on its own server or accesses patient information when troubleshooting the software function, is a business associate of a covered entity." See HIPAA FAQ.

According to Ms. Pritts’ remarks in the report cited above, it appears that the modifications made to HIPAA under the Health Information Technology for Economic and Clinical Health Act (the HITECH Act), along with anticipated regulatory guidance, will remove any doubt that cloud service providers serving HIPAA covered entities are "business associates." This would require, among other things, that covered entities enter into business associate agreements with their cloud providers; standard confidentiality clauses likely will be insufficient. Of course, covered entities, practitioners and others look forward to these long-awaited regulations to help clarify this and other issues.

The $50,000 in penalties that the Office for Civil Rights (OCR) recently imposed on a health care provider in Idaho was due in part to allegations that the HIPAA covered entity had not conducted a risk assessment as required under the HIPAA privacy and security regulations. Of course, HIPAA is not the only law that requires a risk assessment. State laws, such as the Massachusetts data security regulations, also require a risk assessment in order to establish reasonable safeguards for personal information.

In short, this process involves examining what information the organization maintains, the nature of that information, how it moves through the organization and to/from its vendors, and the organization’s current set of safeguards in order to determine the vulnerabilities to that information in terms of privacy, security, accessibility and integrity. This process is critical to ensuring that privacy and security policies are appropriate for the organization. There are a number of resources available to help you get started.

Organizations that have performed risk assessments need to periodically re-evaluate their prior efforts based on changes in their business. So, whether your organization has never conducted a risk assessment, it has been a few years since your last assessment, or there have been substantial changes in your business, this may be as good a time as any to make it a priority.
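The inventory-and-vulnerability exercise described above can be sketched in code. This is a minimal, illustrative model only, assuming a hypothetical scoring scheme in which each information asset is rated by sensitivity and by how many baseline safeguards it lacks; a real assessment under HIPAA or the Massachusetts regulations is far more involved.

```python
# Illustrative sketch of a data-inventory risk assessment.
# The baseline safeguard list and scoring formula are assumptions
# for demonstration, not a legal or regulatory standard.
from dataclasses import dataclass, field

BASELINE_SAFEGUARDS = {"encryption", "access_controls", "backup", "audit_logging"}

@dataclass
class Asset:
    name: str
    data_types: list          # e.g., ["ePHI"] or ["SSN", "payroll"]
    sensitivity: int          # 1 (low) .. 5 (high), assigned by the team
    safeguards: set = field(default_factory=set)

    def missing_safeguards(self):
        # Gaps relative to the (illustrative) baseline.
        return BASELINE_SAFEGUARDS - self.safeguards

    def risk_score(self):
        # Higher sensitivity and more gaps produce a higher score.
        return self.sensitivity * (1 + len(self.missing_safeguards()))

inventory = [
    Asset("billing database", ["ePHI"], 5, {"access_controls", "backup"}),
    Asset("marketing list", ["email"], 2, {"access_controls"}),
]

# Rank assets so remediation starts with the riskiest.
for asset in sorted(inventory, key=lambda a: a.risk_score(), reverse=True):
    print(asset.name, asset.risk_score(), sorted(asset.missing_safeguards()))
```

Even a simple ranking like this helps surface which assets and gaps should drive the resulting policies and safeguards.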


The U.S. Department of Health and Human Services (HHS) today reported its first settlement involving a breach of unprotected electronic protected health information (ePHI) affecting fewer than 500 individuals. According to a statement from the Office for Civil Rights Director Leon Rodriguez, “This action sends a strong message to the health care industry that, regardless of size, covered entities must take action and will be held accountable for safeguarding their patients’ health information.”

The breach occurred in June 2010, when an unencrypted laptop belonging to the Hospice of North Idaho (HONI) that contained ePHI of 441 patients was stolen. The Office for Civil Rights (OCR) learned of the incident when HONI reported it to OCR pursuant to the annual reporting requirement for breaches affecting fewer than 500 individuals under the Health Information Technology for Economic and Clinical Health (HITECH) Act. When OCR investigated, it discovered "that HONI had not conducted a risk analysis to safeguard ePHI." OCR also reported that HONI did not have in place policies or procedures to address mobile device security as required by the HIPAA Security Rule.

HONI agreed to pay HHS $50,000 to settle potential violations of the Security Rule.


One of the hottest topics throughout 2012 was the various states that enacted legislation prohibiting employers from requiring current or prospective employees to disclose a user name or password for a personal social media account, such as Facebook or LinkedIn. In fact, this issue was recently featured in an article on nbcnews.com.

Notably, fourteen states introduced such legislation in 2012, with Michigan becoming the most recent state to enact such legislation when Governor Rick Snyder signed his state’s equivalent law (HB 5523) last Friday. As we have discussed, California, Delaware (dealing with students at colleges and universities), Illinois, Maryland, and New Jersey (pending Governor’s signature) also enacted laws on this issue in 2012.

We anticipate that other states will address this issue through legislation in 2013 and beyond. It is essential for businesses to be conscious of these new laws, and to carefully consider this issue whether or not the state in which they operate currently prohibits such conduct.

As a growing number of states pass laws to restrict employers from gaining access to employees’ personal social media accounts, what employees post in social media can be critical evidence in employment-related investigations and litigations. Check out my partner J. Gregory Grisham’s recent article in HR Professionals Magazine discussing a recent Sixth Circuit decision concerning this issue in an FMLA context. 


On Monday, the Office for Civil Rights released guidance regarding methods for de-identification of protected health information (PHI) in accordance with the HIPAA Privacy Rule and as required by the American Recovery and Reinvestment Act of 2009.

HIPAA covered entities and business associates recognize the increasing risks related to handling "protected health information." One way to reduce these risks is through the "de-identification" process. When performed correctly, de-identification causes the remaining information to no longer constitute "protected health information," and therefore it is no longer subject to the HIPAA privacy and security rules.

The OCR page provides greater detail, in question and answer format, concerning the two methods that can be used to satisfy the Privacy Rule’s de-identification standard:

  • "Expert Determination" – a formal determination by a qualified expert.
  • "Safe Harbor" – the removal of specified individual identifiers as well as absence of actual knowledge by the covered entity (or business associate) that the remaining information could be used alone or in combination with other information to identify the individual.

Under either method, the resulting data is no longer protected by the Privacy Rule, but its usefulness is limited. However, the guidance also describes de-identification strategies that can minimize the loss of usefulness of the data. Of course, where de-identification is not practical, which is often the case, covered entities and business associates need to ensure compliance with the HIPAA privacy and security rules.
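The "Safe Harbor" method above can be illustrated with a short sketch that strips identifier fields from a record. Only a few of HIPAA's 18 Safe Harbor identifier categories are listed, the field names are hypothetical, and real de-identification, including the "actual knowledge" condition, requires far more care than this shows.

```python
# Illustrative Safe Harbor sketch: remove listed identifier fields
# from a record. The field list is a partial, hypothetical subset of
# the 18 HIPAA identifier categories, for demonstration only.
SAFE_HARBOR_FIELDS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "date_of_birth",  # partial list only
}

def deidentify(record: dict) -> dict:
    """Return a copy of the record with identifier fields removed."""
    return {k: v for k, v in record.items() if k not in SAFE_HARBOR_FIELDS}

patient = {
    "name": "Jane Doe",
    "ssn": "000-00-0000",
    "diagnosis_code": "E11.9",
    "year_of_visit": 2012,
}
print(deidentify(patient))  # identifiers removed; clinical data retained
```

Note that simply dropping fields is only part of the standard; the remaining data must also not be identifiable alone or in combination with other information.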

California Governor Jerry Brown has signed into law (AB 2674) new requirements specifying when and how employers must respond to their employees’ requests for inspection and copying of their personnel files. The new requirements become effective January 1, 2013.

Click here for more information about the new law.

Have you received this letter? If you did, it is part of Attorney General Kamala D. Harris’s effort to formally notify scores of mobile application developers and companies that they are not in compliance with one aspect of California’s privacy law. Letters are being sent to the developers of up to 100 non-compliant apps at this time, starting with the most popular apps available on mobile platforms. Even if you have not received the letter, you may want to think about whether you need to comply.

The California Online Privacy Protection Act (CalOPPA) requires commercial operators of online services, including websites and mobile and social apps, that collect personally identifiable information from Californians to conspicuously post a privacy policy. Privacy policies should address how companies collect, use, and share personal information. Companies can face fines of up to $2,500 each time a non-compliant app is downloaded.

This enforcement action by Attorney General Harris is directed at mobile and social app platforms, but CalOPPA applies more broadly – to all commercial operators of online services that collect personally identifiable information about Californians.

It also is important to note that CalOPPA is just one of a number of privacy laws that the Privacy Enforcement and Protection Unit is charged with enforcing. Created in 2012, the Privacy Unit’s mission is to enforce federal and state privacy laws regulating the collection, retention, disclosure, and destruction of private or sensitive information by individuals, organizations, and the government. This includes laws relating to cyber privacy, health privacy, financial privacy, identity theft, government records and data breaches.

The establishment of the Privacy Unit and this more recent enforcement of CalOPPA suggest California is stepping up the enforcement of its privacy laws. Privacy officers, security officers, compliance officers, information security officers, risk managers, and others in California and beyond should take stock of their compliance efforts and make adjustments where necessary.

The effects of a hurricane like Sandy should remind all businesses of the importance of disaster recovery planning. When these storms threaten, there is no shortage of images of sandbags and plywood being used to protect companies’ bricks and mortar. Rarely, however, do we see the steps businesses should take to protect their information and technology assets from natural disasters. These assets are essential to the success of most organizations, making appropriate preparations critical.

There are many aspects to comprehensive disaster recovery planning. Below are just a few of the key steps a company should take concerning its information and technology assets:

  • Have a clear purpose and avoid internal silos. Companies should be clear about what they are setting out to do and involve the appropriate segments of their organizations. Disasters do not just affect IT departments, they also affect the sales force, human resources, legal, finance, and top management. Leadership from these and other business segments need to be at the table to ensure, among other things, appropriate coordination among the segments and an awareness of all available company resources. Excluding critical segments from the process will make it difficult to carry out the next critical step – assessing the risks.
  • Assess risks. Before a company can develop a disaster recovery plan, it must first identify the information and technology assets it needs to protect, their locations, their role in the success of the business, their associated costs, and the overall and specific risks that apply to those assets. Different disasters pose different risks and require different safeguards. It also is important to analyze how the business’s operations would be affected by the loss of vital components and assets, including identifying what information and technology systems are needed to safely keep the doors open.
  • Employee safety. Information and technology assets are critically important, but not at the expense of human life. Employees need to be reminded that their safety comes first.
  • Develop your plan. Having involved key personnel and assessed the risks, the business is in a position to develop an enterprise-wide disaster recovery plan. Such a plan might include the following specific steps:
    • Establish redundancies. If a data center in lower Manhattan is underwater, being able to switch to another in California, Texas or another part of New York State will be essential to business continuity. The same is true for voice and electronic communications systems.
    • Regular backups. Frequent and regular backups are critical to ensuring the preservation of important company data, as well as the data it may maintain for others. Companies also have to consider the integrity and accessibility of that data, which easily can be compromised by certain disasters.
    • Train employees. No one likes fire drills, but they serve a valuable purpose. Companies should not wait for a disaster in order for employees to learn about the company’s disaster recovery program.
  • Update the plan. As the business changes, grows, and adds locations and people, the disaster recovery plan may need to be updated accordingly. A regular review of the plan is critical.
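The "regular backups" step above also calls for verifying data integrity, since a copy that cannot be restored intact is no backup at all. A minimal sketch, with hypothetical file paths, might verify each copy against a checksum; real programs would use dedicated backup tooling, offsite or redundant storage, and scheduled runs.

```python
# Illustrative sketch: copy a file to a backup location and verify
# the copy with a SHA-256 checksum before trusting it for recovery.
# Paths are hypothetical examples.
import hashlib
import shutil
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Checksum used to confirm the backup matches the original."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def backup_file(src: Path, backup_dir: Path) -> Path:
    backup_dir.mkdir(parents=True, exist_ok=True)
    dest = backup_dir / src.name
    shutil.copy2(src, dest)  # copy contents and metadata
    # Verify integrity: a corrupted backup is worse than none,
    # because it gives a false sense of security.
    if sha256_of(src) != sha256_of(dest):
        raise IOError(f"backup of {src} failed integrity check")
    return dest

# Example usage with a small sample file:
src = Path("records.csv")
src.write_text("id,value\n1,hello\n")
copy = backup_file(src, Path("backups"))
print(copy, sha256_of(copy) == sha256_of(src))
```

The same verify-before-trusting idea applies at larger scale: periodic test restores are what confirm a backup program will actually support business continuity after a disaster.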

So, as you clean up from Sandy, think about whether your disaster recovery plan worked the way you expected. If it did not, make appropriate changes. If you think your company could have benefited from such a plan, there is no time like the present to begin developing one.