Written by Jeffrey M. Schlossberg

When does a medical clinic employee’s unauthorized texting of confidential patient health information result in liability for the clinic? The answer: it depends.

In Doe v. Guthrie Clinic, Ltd., the Second Circuit Court of Appeals dismissed a patient’s claim against a medical corporation for alleged breach of fiduciary duty based on a non-physician employee’s unauthorized disclosure of confidential medical information. It did so because the New York State Court of Appeals answered the following certified question in the negative: “Whether, under New York law, the common law right of action for breach of the fiduciary duty of confidentiality for the unauthorized disclosure of medical information may run directly against medical corporations, even when the employee responsible for the breach is not a physician and acts outside the scope of her employment.”

In Doe, John Doe was treated at a clinic for a sexually transmitted disease (“STD”). A nurse, who knew Doe’s girlfriend, texted the girlfriend to let her know of Doe’s STD. Her texts were unrelated in any way to Doe’s treatment. After Doe learned of the texts, he complained to the clinic. The nurse was fired. The clinic acknowledged that Doe’s confidential information had been improperly accessed and disclosed and that appropriate disciplinary action had been taken. Doe then commenced a federal diversity action.

In analyzing the certified question presented, the State’s highest court declined to hold the clinic responsible under a claim of breach of fiduciary duty. Generally, under the doctrine of respondeat superior, a medical corporation may be vicariously liable for the wrongful acts of its employees, but only if those acts were committed in furtherance of the employer’s business. In Doe, the nurse’s conduct was not within the scope of her employment.

However, health care employers must still exercise caution. Despite the ruling in the case, the court did state that a medical corporation “may also be liable in tort for failing to establish adequate policies and procedures to safeguard the confidentiality of patient information or to train their employees to properly discharge their duties under those policies and procedures.” A health care practice that complies with the privacy and security regulations under HIPAA and applicable state law will be in a good position to avoid this kind of liability. Of course, inadequate policies addressing the protection of confidential patient information could expose the practice to damages in these kinds of suits, as well as penalties under HIPAA.

A New Jersey Appellate Court recently ruled that an employee who removes or copies her employer’s documents for use in her whistleblower or discrimination case may be prosecuted criminally for stealing. In State v. Saavedra, the employee had taken highly confidential original documents owned by her employer, contending that she did so to support her employment discrimination suit and that she therefore should be free from prosecution. As we have detailed, both the trial court and the Appellate Court agreed that the employee’s taking of documents could sustain an indictment.

Saavedra argued on appeal that the trial judge had erred because a 2010 New Jersey Supreme Court case had established an absolute right for employees with employment discrimination lawsuits to take potentially incriminating documents from their employers. The Appellate Division disagreed and found the Supreme Court case did not establish such a bright-line rule; instead, it said, the Supreme Court delivered a seven-part “totality-of-the-circumstances” test to determine whether a private employer can terminate its employee for the unauthorized taking of its documents.  The Court went on to hold that the State had put forth enough evidence before the grand jury to establish a prima facie case.

An employee’s taking of company documents prior to or at the time of termination is a scenario that may be familiar to many employers. While the taking of documents may often be discovered in connection with whistleblower or discrimination claims, many employers also discover that documents and/or confidential information have been taken in connection with non-compete or unfair competition matters. The Court’s decision could have serious implications any time an employee takes documents, and it illustrates that employers are not without recourse.


If you are a public sector employer, you may be particularly interested in an article written by my fellow shareholder and practice group member, Marlo Johnson Roebuck. She writes about a recent case, Graziosi v. City of Greenville, involving a police department’s decision to terminate a police officer for statements she made on Facebook.

As Marlo notes in her article, these situations can arise in all kinds of workplaces, no matter the city, state or country, and whether in the public or private sector. Here, Graziosi involved First Amendment concerns because the police department is a public sector employer. But there could be a range of other issues that flow from employee activity in social media. These include inappropriate endorsements of company products and services by employees, complying with industry-specific regulatory guidance such as in the finance industry, disclosures of trade secrets, allegations of infringement on protected concerted activity rights under the National Labor Relations Act (NLRA), discrimination under the Americans with Disabilities Act (ADA) or the Genetic Information Nondiscrimination Act (GINA), and so on. Many of these issues and others are discussed in our Special Report – Social Media in the Workplace.

A well-crafted social media policy is a critical starting point. However, businesses also need to consider their game plan for when, inevitably, the company learns about activity by one or more of its employees in social media that creates business, legal and other risks. Every situation is different, and there will be twists and turns that have to be addressed at the time. However, thinking through certain strategies and approaches ahead of time can help the business avoid some potentially risky missteps. For example, businesses should (i) consider having a process for determining whether to investigate, (ii) think about who ought to be involved in and coordinate the investigation, (iii) determine whether a third-party monitoring company should be engaged, and possibly develop a relationship with one ahead of time so that it will be ready to quickly step in as needed, (iv) examine whether current policies and laws limit the company’s ability to investigate and the scope of that investigation, (v) identify who should be responsible for managing client and business partner relationships that also might be affected by such an incident, and (vi) set out a plan for how to handle the information obtained in the investigation and what disciplinary steps, if any, should be taken. While by no means exhaustive, a list like this certainly would help prepare a company should it need to quickly address a flare-up in social media that could have harsh consequences for the organization.


In honor of National Data Privacy Day, we provide the following “Top 14 for 2014.”  While the list is by no means exhaustive, it does provide critical areas businesses will need to consider in 2014.

  1. Location-Based Tracking.  As the use of GPS-enabled devices becomes more and more prevalent, employers often face the difficult decision of just how much information they may obtain about an employee’s whereabouts.  This is particularly true when an employee is absent from work, is traveling for business, or makes a representation about his or her location that the employer questions for one reason or another.  The case law in this area is evolving rapidly, and employers in both the public and private sectors can expect to face this issue in the near future.
  2. Bans On Requesting Social Media Passwords. As we have previously discussed, numerous states have passed legislation prohibiting employers from requiring current or prospective employees to disclose a user name or password for a personal social media account. Sixteen states introduced such measures in 2013, and many of them are expected to pass in 2014.
  3. Disaster Recovery Plans. Protecting information and technology assets from natural disasters and other emergencies is often an afterthought. This is especially relevant given the numerous weather difficulties faced by businesses throughout 2013, from floods to fires to subzero temperatures.  However, developing a comprehensive disaster recovery plan now can avoid the significant expense, and often irretrievable loss of data, associated with natural disasters.
  4. BYOD. More and more businesses are realizing the risks of allowing employees to utilize their own electronic devices in the workplace and are turning to Bring Your Own Device (“BYOD”) programs to diminish some of these risks.  Businesses considering BYOD should review our comprehensive BYOD issues outline.
  5. User-Generated Health Data.  The transformation of health information into electronic format has been well documented and will continue into the future.  One of the newest concerns for 2014, however, is health data that individuals voluntarily provide to track or chart their own health or fitness.  Devices and applications such as the Nike FuelBand and Fitbit allow individuals to enter more and more health information about themselves electronically, yet the privacy and security of this information remain largely unsettled.
  6. Insurance. Like many other risks, information risk can be addressed in part through insurance. More carriers are developing products dealing with personal information risk, and specifically data breach response. This kind of coverage should be considered by any organization which maintains personal information.
  7. Risk Assessment. Many businesses remain unaware of how much personal and confidential information they maintain, who has access to it, how it is used and disclosed, how it is safeguarded, and so on. Getting a handle on a business’ critical information assets must be the first step, and is perhaps the most important step to tackling information risk. It is logically impossible to adequately safeguard something you are not aware exists. In fact, failing to conduct a risk assessment may subject the business to penalties under federal and/or state law.
  8. Develop a Written Information Security Program. Even if adopting a written information security program (WISP) to protect personal information is not an express statutory or regulatory mandate in your state (as it is in MA, MD, TX, CT, etc.), having one is critical to addressing information risk. Not only will a WISP better position a company when defending claims related to a data breach, but it will help the company manage and safeguard critical information, and may even help the company avoid whistleblower claims from employees.
  9. Training. A necessary component of any WISP and a required element under most federal and state laws mandating data security is training. In addition to meeting compliance requirements, training employees and supervisors also will not only aid in defending any potential breach of privacy claim that may be asserted against the company, but also may prevent a potential breach from occurring.
  10. HHS/OCR Investigations.  The Office for Civil Rights has recently stepped up its efforts to enforce the HIPAA Security Rule.  As we previously discussed, these enforcement activities are likely to increase in 2014 following a recent report from the Office of Inspector General, which concluded that OCR did not meet its federal requirements for oversight and enforcement.
  11. Develop a Plan for Breach Notification. All state and federal data breach notification requirements currently in effect require notice be provided as soon as possible. Failing to respond appropriately could result in significant liability.  This is true even when the number of individuals affected is relatively small.  Developing a breach response plan is not only prudent but also may be required under federal or state law.
  12. Investigating Social Media.  Social media continues to grow on a global scale, and the content available on a user’s profile or account is often being sought in connection with litigation.  In fact, failure to preserve relevant information in social media may have dire consequences.  Further, while public content may generally be utilized without issue, if private content is accessed improperly, serious repercussions can follow.
  13. New Technologies. As anyone who has purchased a phone or television in the last year has seen, technology is evolving extremely rapidly, and a product that is the “latest and greatest” today is often outdated six months down the road.  Staying familiar with these technologies and their capabilities will allow businesses to better address any potential issues or concerns they may implicate, including how those technologies address information risk.
  14. Watch for New Legislation. Today, managing data and ensuring its privacy, security and integrity is critical for businesses and individuals, and is increasingly the subject of broad, complex regulation. Because no national law requiring the protection of personal information has yet been passed in the U.S., companies are left to navigate a constantly evolving web of state legislation. Companies therefore need to stay tuned in order to remain compliant and competitive in this regard.

As one nursing facility in New York has learned, asking employees or applicants about their family medical history can violate the Genetic Information Nondiscrimination Act (“GINA”) and draw the ire of the U.S. Equal Employment Opportunity Commission (EEOC). Founders Pavilion, Inc., a former Corning, N.Y. nursing and rehabilitation center, will pay $370,000 to settle discrimination claims, the agency reported.

To help avoid these kinds of claims, check out our GINA FAQs. As discussed in those FAQs, under the GINA regulations, employers can take steps to protect themselves from liability such as by using a GINA “Safe Harbor Notice” – a statement directing the doctor not to ask for and/or disclose “genetic information” (i.e., family medical history).

According to the EEOC, the nursing facility requested family medical history as part of its post-offer, pre-employment medical exams of applicants. Subject to limited exceptions, GINA prevents employers from requesting genetic information or making employment decisions based on genetic information.

Many businesses conduct such exams. But what many of them do not realize is that the regulations issued by the EEOC to enforce GINA prohibit employers from requesting information about family medical history. This prohibition extends to inquiries that are reasonably likely to elicit such information. It also applies to virtually all inquiries employers make of employees and applicants, such as fitness-for-duty evaluations, examinations or inquiries in response to ADA reasonable accommodation requests, FMLA medical certifications, return-to-work exams, periodic annual medical exams, and even some purportedly “voluntary” wellness-related exams (unless the situation fits within a narrow GINA exception).

From the EEOC’s perspective, it may not matter whether the request is made, or the information received, by the employer or the employer’s agent, and the prohibition applies even if the request is not made directly to the employee or applicant. For example, EEOC regulations prohibit employers from searching employees’ or applicants’ social media sites for genetic information, although inadvertent acquisition does not violate the law.

Genetic discrimination is one of the six national priorities identified by the EEOC’s Strategic Enforcement Plan (SEP) and, therefore, an area employers should address as soon as possible through policy and training. However, employers should remember that GINA’s protections are not limited to prohibiting certain inquiries or acquisitions concerning genetic information. Genetic information can be, and frequently is, acquired by employers. When it is, that information is subject to the same confidentiality requirements that apply under the ADA. In addition, GINA has specific rules about when genetic information can be disclosed. So, employers need to be concerned not only about genetic information coming in, but also about how it can be used and whether it should be sent out.

Massachusetts Senator Elizabeth Warren recently introduced legislation that would ban employers from conducting credit checks of prospective employees during the hiring process.  Known as the Equal Employment for All Act, the measure would amend the Fair Credit Reporting Act to prohibit employers from using consumer credit reports to make employment decisions.  Notably, the Act would permit exceptions for certain positions, e.g., those requiring national security clearance.

According to Senator Warren,

It was once thought a credit history would provide insight into a person’s character and today, many companies routinely require credit reports from job applicants, but research has shown that an individual’s credit rating has little to no correlation with his or her ability to succeed in the workplace.  A bad credit rating is far more often the result of unexpected medical costs, unemployment, economic downturns, or other bad breaks than it is a reflection on an individual’s character or abilities.  Families have not fully recovered from the 2008 financial crisis, and too many Americans are still searching for jobs. This is about basic fairness — let people compete on the merits, not on whether they already have enough money to pay all their bills.

The legislation is supported by a number of worker advocacy and civil rights groups, although no companion measure has been introduced in the House.  The Act would appear to mirror the goals of numerous state laws that already prohibit employers from using credit information in making employment decisions.

A report issued by the Department of Health and Human Services Office of Inspector General (“OIG”) concludes that the Office for Civil Rights (“OCR”) did not meet all of its federal requirements for oversight and enforcement of the HIPAA Security Rule. While the report noted OCR met some of these requirements, it also found that:

  • OCR had not assessed the risks, established priorities, or implemented controls for its HITECH requirement to provide for periodic audits of covered entities to ensure their compliance with Security Rule requirements.
  • OCR’s Security Rule investigation files did not contain required documentation supporting key decisions because its staff did not consistently follow OCR investigation procedures by sufficiently reviewing investigation case documentation.

OIG also found that OCR had not fully complied with Federal cybersecurity requirements for its information systems used to process and store investigation data. The report recommended that OCR:

  • assess the risks, establish priorities, and implement controls for its HITECH auditing requirements;
  • provide for periodic audits in accordance with HITECH to ensure Security Rule compliance at covered entities;
  • implement sufficient controls, including supervisory review and documentation retention, to ensure policies and procedures for Security Rule investigations are followed; and
  • implement the NIST Risk Management Framework for systems used to oversee and enforce the Security Rule.

OCR’s Response. In its response to OIG’s findings, attached as an appendix to the report, OCR generally concurred with OIG’s recommendations and described actions it has taken to address them. OCR’s response to the report provides valuable information to companies as they develop their HIPAA compliance programs, including:

  • From 2008 through 2012, OCR obtained corrective action from covered entities in more than 13,000 cases in which it found noncompliance with HIPAA, and reached resolution agreements in 11 cases with payments totaling approximately $10 million.
  • The findings from the pilot audits OCR ran in 2012 indicate that covered entities generally have more difficulty complying with the Security Rule than other aspects of HIPAA and that small covered entities struggle with HIPAA compliance in each of the assessment areas – privacy, security and breach notification.
  • Future audits “are less likely to be broad assessments generally across the Rules and more likely to focus on key areas of concern for OCR identified by new initiatives, enforcement concerns, and Departmental priorities.”

OCR’s response also noted that no monies have been appropriated for a permanent audit program. However, covered entities and business associates should not see this lack of funding for a permanent audit program as giving them a pass on HIPAA compliance. The report makes clear that OCR must find a way to meet its audit requirements under HIPAA.

OCR’s recent enforcement activity also demonstrates a commitment to holding companies accountable under HIPAA. In 2013 (through December 20), OCR reached five resolution agreements with payments totaling approximately $3.7 million. These figures from a single calendar year represent nearly half the total number of resolution agreements and payments that OCR obtained over the five-year period from 2008 through 2012.

In this enforcement environment, it is imperative that covered entities and business associates regularly review their HIPAA compliance program and implement ongoing HIPAA training for their employees.

Fingerprints, voice prints and vein patterns in a person’s palm are three examples of biometrics that may be “moving into the consumer mainstream to unlock laptops and smartphones, or as a supplement to passwords at banks, hospitals and libraries,” reports Anne Eisenberg at the New York Times. Of course, these technologies, aimed at increasing security and, to a lesser degree, convenience, raise data privacy concerns and other risks. However effective, convenient, and efficient these technologies may be, companies need to think through carefully their adoption and implementation, particularly in the workplace.

Below are just a few of the kinds of questions companies should be asking before implementing technologies that involve capturing biometric information.  It is likely that such technologies will go mainstream and, if so, spawn new laws regulating the use of biometric information. Thus, companies using such technologies will need to continue to monitor the legal landscape to manage their risks.

Can we collect this information? In some cases, the answer may be no. For example, in New York, Labor Law Section 201-a prohibits the fingerprinting of employees by private employers, unless required by law. However, according to an opinion letter issued by the State’s Department of Labor on April 22, 2010, a device that measures the geometry of the hand is permissible as long as it does not scan the surface details of the hand and fingers in a manner similar or comparable to a fingerprint. Other states may permit the collection of biometric information provided certain steps are taken. The Illinois Biometric Information Privacy Act, for instance, prohibits private entities from obtaining a person’s or customer’s biometric identifier or biometric information unless the person is informed in writing and consents in writing.

If we can collect it, do we have to safeguard it?  Regardless of whether a statute requires a business to safeguard such information, we believe it is good practice to do so. Indeed, states such as Illinois (see above) already require a reasonable standard of care when storing, transmitting or disclosing biometric information.

Is there a notification obligation if unauthorized persons get access to biometric information? In some states the answer is yes.  The breach notification statutes in states such as Michigan include biometric data in the definition of personal information. See MCLS § 445.72.

Are there any requirements for disposing of this information? Yes, a number of states (e.g., Colorado and Massachusetts) require that certain entities meet minimum standards for properly disposing records containing biometric information.

Can employees claim this technology amounts to some form of discrimination? In addition to securing devices and accounts, biometric technologies also are being used to track employee time and attendance in order to enhance workforce management. These different applications can form the basis of discrimination claims. For example, earlier in 2013, the U.S. Equal Employment Opportunity Commission (EEOC) claimed an employer’s use of a biometric hand scanner to track employee time and attendance violated federal law by failing to accommodate certain religious beliefs which opposed the use of such devices.

Retinal scan technology is another biometric technology that can be used for identification/security purposes.  However, as explained in a recent Biometric.com article, “examining the eyes using retinal scanning can aid in diagnosing chronic health conditions such as congestive heart failure and atherosclerosis…[as well as] diseases such as AIDS, syphilis, malaria, chicken pox and Lyme disease [and] hereditary diseases, such as leukemia, lymphoma, and sickle cell anemia.” Thus, the data captured by such scans can inform employers about the health conditions of their employees, raising a range of medical privacy, medical inquiry and discrimination issues under federal and state laws, such as the Americans with Disabilities Act. 

Privacy and data security issues and concerns do not stop at the water’s edge. Companies needing to share personal information, even when the sharing will take place inside the same “company,” frequently run into challenges when that sharing takes place across national borders. In some ways, the obstacles created by the matrix of federal and state data privacy and security laws in the U.S. are dwarfed by the matrix that exists internationally. Most countries regulate to some degree the handling of data, from access, to processing, to disclosure and destruction. And, the law continues to develop rapidly, sometimes due to unexpected events.

Take, for example, the U.S. Safe Harbor program that was designed to facilitate the transfer of personal data of individuals in the European Union (EU) to the United States. Because the EU believes that the law in some countries, including the U.S., fails to provide “adequate safeguards,” the general rule is that personal data of EU persons cannot be sent to the U.S. unless an exception applies. One exception is based on a negotiated deal between the EU and the U.S., commonly known as the U.S. Safe Harbor, a program which currently is in some jeopardy due to the recent reports of NSA monitoring, Snowden, etc.

Currently, to meet the Safe Harbor, a company must take certain steps, including (i) appointing a privacy ombudsman; (ii) reviewing and auditing data privacy practices; (iii) establishing a data privacy policy that addresses the following principles: notice, choice, onward transfer of data, security, integrity, access and enforcement; (iv) implementing privacy and enforcement procedures; (v) obtaining consents and creating inventory of consents for certain disclosures; and (vi) self-certifying compliance to the U.S. Department of Commerce.

A recent statement from Viviane Reding, European Commissioner for Justice, Fundamental Rights and Citizenship, quoted in The Guardian, October 17, 2013, signals some changes may be in store for the Safe Harbor:

“The Safe Harbour may not be so safe after all. It could be a loophole because it allows data transfers from EU to US companies, although US data protection standards are lower than our European ones,” said Reding. “Safe Harbour is based on self-regulation and codes of conduct. In the light of the recent revelations, I am not convinced that relying on codes of conduct and self-regulation that are not policed in a strict manner offer the best way of protecting our citizens.”

At the same time, the EU continues to update and strengthen its protections for personal data. Companies that operate globally need to be sensitive not only to complying with the laws governing activities within a jurisdiction, but also to those governing activities between jurisdictions. Common business decisions, such as deciding where data will be stored, setting up global databases for employees’ medical, personnel and other information, or arranging for enterprise-wide employee benefits or monitoring programs, can face significant obstacles relating to the interplay of the data privacy and security laws of the countries involved.

A familiar story – small health care provider suffers a data breach affecting patient data, reports incident to the federal Office for Civil Rights (OCR) and winds up becoming subject to an OCR investigation that goes well beyond the breach itself, resulting in a significant settlement payment and corrective action plan.

In this case, a relatively small adult and pediatric dermatology practice in Concord, Massachusetts has agreed to settle potential violations of the Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy, Security, and Breach Notification Rules, agreeing to a $150,000 payment and a comprehensive corrective action plan that is subject to OCR review.

So what did OCR allege the provider did wrong that led to this settlement and corrective action plan?

By way of background, on October 7, 2011, the provider reported to HHS a breach of its unsecured electronic protected health information (ePHI) that resulted when an unencrypted thumb drive that stored ePHI concerning surgeries of approximately 2,200 individuals was stolen from an employee’s car. The provider notified its patients within 30 days of the theft and provided media notice. On November 9, 2011, HHS notified the provider that OCR intended to investigate the provider’s compliance with the Privacy, Security, and Breach Notification Rules.

Providers and other covered entities need to realize that if they experience an unexpected theft or other event that results in a reportable breach, it may very well open them up to a compliance review by the OCR.

What potential violations of HIPAA did the OCR allege based on its investigation?

  • The provider did not conduct an accurate risk assessment until October 1, 2012.
  • The provider did not fully comply with the Breach Notification Rule, which requires written policies and procedures and training of workforce members on those policies and procedures, until February 7, 2012.
  • The provider failed to reasonably safeguard the thumb drive that wound up being stolen.

Thus, the issue seems to be not so much whether the covered entity appropriately responded to the breach at hand, but whether it was compliant with the Privacy, Security, and Breach Notification Rules prior to the incident and could have avoided the breach. As suggested here, taking compliance steps after the incident will not shield a covered entity from OCR enforcement, although it may have softened the blow.

Lesson for providers and other covered entities: Don’t wait until you lose a thumb drive before getting compliant.