Two recent surveys provide some detailed analysis of cybersecurity and its impact in today’s world.

The Global State of Information Security Survey 2015, conducted by PricewaterhouseCoopers LLP (PWC), found a 48% increase in the number of security incidents detected over 2013.  PWC, which surveyed more than 9,700 security, information technology and business executives, found a total of 42.8 million security incidents detected on an annual basis.  While this figure appears astronomical, it does not include undetected attacks, which would only serve to increase it.  Many of these attacks result in what is commonly known as a data breach.

From a loss perspective, the Survey found that the annual financial costs of investigating and mitigating security incidents increased substantially this year, particularly among large organizations, with the number of respondents reporting losses of $20 million or more almost doubling over 2013.  Notably, most respondents experienced a minimum of $50,000 in financial losses due to security incidents.

Notwithstanding the significant number of incidents detected and the related losses, the 2014 Critical Security Control Survey, conducted by the SANS Institute, found that only 26% of CEOs and top-level managers are aware of cybersecurity risks and remediation obligations.  The SANS survey, of 300 cybersecurity professionals, also found that less than 50% of companies have proper technological controls against malware and other malicious code, and that 63% of companies say their in-house cybersecurity group lacks the necessary resources to assess and meet the cyber threat.

As we mentioned earlier this year, and as confirmed by each of these surveys, organizations need to implement data incident response plans.  To this end, we have prepared a summary of some of the Key Action Items for Responding to Data Breaches.  While this list is not exhaustive, it should provide a general guide for incident response.

California Governor Jerry Brown signed AB-1710 into law yesterday, amending the state's existing data breach notification statute. The most significant change: if a company that experiences a data breach offers identity theft prevention and mitigation services, those services must be provided to affected persons at no cost for at least 12 months when the breach exposed or may have exposed specified personal information, and the breach notification must say so. The new law also expands the application of safeguard requirements for personal information and further prohibits certain uses and disclosures of Social Security numbers. The new law becomes effective January 1, 2015.

Identity Theft Prevention and Mitigation Services (“Credit Monitoring”) Notification Mandate.

Currently California and 46 other states have laws that, in general, require entities that own or license certain personal information to notify individuals whose personal information has been involved in a data breach. The precise definitions of these or similar terms vary somewhat from state to state. But none of the states has imposed a broadly applicable requirement by statute or regulation that entities facing a breach notification obligation must also provide credit monitoring services, or “identity theft prevention and mitigation services,” to affected persons. Of course, many companies have provided such services, and State Attorneys General have urged businesses to extend such services. What the law appears to require is that if identity theft prevention and mitigation services are provided, the notification must inform the affected persons that they will be provided for at least 12 months and at no cost, and the notice has to provide information on how to obtain the services.

The language adding the new “identity theft prevention and mitigation services” requirement is set forth at Cal. Civil Code 1798.82(d)(2)(G) and it reads:

If the person or business providing the notification was the source of the breach, an offer to provide appropriate identity theft prevention and mitigation services, if any, shall be provided at no cost to the affected person for not less than 12 months, along with all information necessary to take advantage of the offer to any person whose information was or may have been breached if the breach exposed or may have exposed personal information defined in subparagraphs (A) and (B) of paragraph (1) of subdivision (h).

The new requirement applies only if the breach involved Social Security numbers, driver’s license numbers or California identification card numbers, but not credit card account numbers or the other elements of personal information in the existing California law.

It is also interesting to note that California’s existing law provides that HIPAA covered entities (there is no mention of business associates here) are deemed to comply with “the notice requirements in subdivision (d)” of the California law, if they comply with the breach notification obligations under HIPAA. Subdivision (d) refers to Cal. Civil Code 1798.82(d), the same section which contains the new identity theft prevention and mitigation services notification requirement. It is unclear, however, whether the reference to subdivision (d) would include the identity theft prevention and mitigation services notification requirement, since that seems to create an obligation beyond the notice requirement where identity theft prevention and mitigation services are offered. Covered entities have to be careful here and also consider the preemption provisions under HIPAA.

Safeguarding Personal Information.

Prior to AB-1710, California required businesses that own or license personal information about a California resident to implement and maintain reasonable security procedures and practices appropriate to the nature of the information, to protect the personal information from unauthorized access, destruction, use, modification, or disclosure. To own or license meant that the business retained personal information as part of the business’ internal customer account or for the purpose of using that information in transactions with the person to whom the information relates. AB-1710 expands this requirement to businesses that “maintain” personal information, that is, personal information that a business maintains but does not own or license. This is a significant expansion of the safeguards requirement, and businesses maintaining the personal information of California residents should be taking steps to safeguard that information, whether it applies to customers, employees, students, or other residents. For this purpose, personal information means:

an individual’s first name or first initial and his or her last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted:

(A) Social security number.

(B) Driver’s license number or California identification card number.

(C) Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.

(D) Medical information (any individually identifiable information, in electronic or physical form, regarding the individual’s medical history or medical treatment or diagnosis by a health care professional).

Note, however, that this obligation does not apply to providers of health care, health care service plans, or contractors regulated by the Confidentiality of Medical Information Act.

Prohibitions on Sale and Marketing of Social Security Numbers.

California also maintained specific protections for Social Security numbers prior to AB-1710, including prohibiting persons or entities from publicly posting or displaying an individual’s Social Security number or doing certain other acts that might compromise the security of an individual’s Social Security number, subject to certain exceptions.

AB-1710 amends those protections to prohibit the sale, advertisement for sale, or offer to sell an individual’s Social Security number. The prohibition does not extend to the release of Social Security numbers when the release is incidental to a larger transaction and is necessary to identify the individual in order to accomplish a legitimate business purpose. This exception might apply, for example, where, in the course of a sale of a business, records containing Social Security numbers are released to the buyer. However, release of an individual’s Social Security number for marketing purposes is not permitted. Additionally, the release of an individual’s Social Security number for a purpose specifically authorized or specifically allowed by federal or state law is not prohibited by this change.

States have been amending their breach notification and data security laws over the past few years, likely in response to the data breaches constantly in the media and the large number of complaints of identity theft being received by federal and state agencies. See, e.g., recent amendments to Florida’s law. Companies need to be aware of these changes and start reviewing and updating their security incident response plans, as well as their overall risk assessment, particularly in California, where the law now may require a more costly response in certain cases.

The Department of Health and Human Services (“HHS”) recently released guidance on the application process to obtain a Health Plan Identifier (“HPID”).  An HPID is an all-numeric 10-digit identification number that many HIPAA-covered health plans are required to adopt by November 5, 2014.  Think of an HPID like an EIN for health plans.  HPIDs will be used in all HIPAA standard transactions, such as the payment of health care claims, claim status checks, health plan eligibility confirmations, and premium payments.

The HPID requirement is another product of the Affordable Care Act and seeks to reduce administrative costs by promoting electronic transactions between medical providers and health plans.  To acquire HPIDs for their health plans, plan sponsors will have to register with the Centers for Medicare and Medicaid Services’ (“CMS”) Health Plan and Other Entity Enumeration System (“HPOES”) available through the CMS Enterprise Portal.

It is fair to say that prior to this new guidance the instructions for the application process were not exactly easy to follow.  This new two-page document, however, navigates users through the HPID application process step-by-step.  In essence, employers will register their organization, identify approved users in the web portal and their roles, and designate an “Authorizing Official User” to act on behalf of the organization in approving/submitting applications.

HPIDs are not required for every health plan, only Controlling Health Plans (“CHPs”).  A CHP is a health plan that either controls its own business activities, actions, or policies, or is controlled by an entity that is not a health plan, and that exercises sufficient control over any Subhealth Plans (“SHPs”).  An SHP is simply a health plan whose business activities are controlled by a CHP, and obtaining an HPID for an SHP is optional.

Making the HPID optional for SHPs recognizes that employers can structure their health plans in a variety of different ways.  For instance, a welfare benefit plan that has three medical benefit arrangements is only required to obtain a single HPID for the welfare benefit plan.  The employer could, however, assign separate HPIDs to each medical arrangement if it would simplify claims administration, or for any other reason.  For most entities, coordination with the third-party claims administrator will determine whether obtaining HPIDs for SHPs has any benefit.

HPIDs will be required to be used in HIPAA standard transactions beginning November 7, 2016.  It is the obligation of the HIPAA covered entity to use an HPID in electronic HIPAA transactions and to ensure that its business associates are also using an HPID.

When many people think about identity theft and data breaches, they tend to think about credit card data and bank accounts. This makes sense given the large-scale breaches in the news lately. However, Reuters reported last week that medical information is “worth 10 times more than [] credit card number[s] on the black market,” a trend that has been developing for some time. This makes health care providers and their business associates increasingly likely to be targets of an attack. Small businesses in this industry are not immune, as even a solo practitioner can amass data on thousands of patients. See NYT article making this point and providing some helpful strategies.

Like financial institutions, insurance companies, and retailers, businesses in the healthcare industry maintain vast amounts of sensitive data including health insurance policy numbers, social security numbers, birth dates and other billing information, not to mention sensitive diagnosis information. As healthcare costs continue to rise, the opportunity to use another’s identity, policy or account to obtain healthcare products and services is a strong driver of the value of this data on the black market. In addition, providers and other health care businesses generally are not as advanced as banks and financial institutions in safeguarding individually identifiable health information, or spotting identity theft. As data is not perishable, and this sector is reported to generally be slower in reacting, identity thieves tend to have a longer time frame to use the information.

The increasing exposure for businesses in the healthcare industry is evident in recent studies by the Ponemon Institute, which show that the share of healthcare organizations reporting cyber attacks rose from 20 percent in 2009 to 40 percent in 2013, as noted in the Reuters article. Other reports highlight increases in HIPAA breaches. See also MelaMedia’s helpful collection of statistical information concerning HIPAA data breaches and other metrics.

Clearly, the healthcare industry will need to continue to address this increasing threat, although static budgets and strapped resources of course present significant challenges. For organizations that have not already worked through a HIPAA compliance program, there is a bunch of low-hanging fruit that can be adopted with relative ease and low expense to safeguard data. Creating adequate safeguards and a culture of privacy and security does not happen overnight. It requires buy-in and leadership from senior management, a careful understanding of the organization’s risks and vulnerabilities, knowing what the law requires, coordination with key persons inside the organization and certain third parties outside the organization, frequent and regular security awareness and training, and regular re-evaluation of the organization’s approach for changed circumstances.

Beginning January 1, 2015, Delaware employers who dispose of records containing the unencrypted personal identifying information of employees must take steps to ensure the privacy of such information.  The bill, H.B. 294, was recently signed by Delaware Governor Jack Markell.

The new law defines personal identifying information as an employee’s first name or first initial and last name in combination with one of the following data elements that relate to the employee, when either the name or the data elements are not encrypted:

  • the employee’s signature;
  • full date of birth;
  • social security number;
  • passport number;
  • driver’s license or state identification number;
  • insurance policy number;
  • financial services account number;
  • bank account number;
  • credit card number;
  • debit card number;
  • any other financial information; or
  • confidential health care information.

Under the law, employers are required to take reasonable steps to destroy or arrange for the destruction of an employee’s personal identifying information when it is in a “tangible medium” or is stored in an electronic or other medium and is retrievable.   Destruction is to be by shredding, erasing, or otherwise modifying the personal identifying information to make it entirely “unreadable or indecipherable” through any means.  Importantly, the law permits a private right of action for any employee who incurs actual damages due to the reckless or intentional violation of this statute.

Delaware also enacted a companion bill, H.B. 295, in July which imposed the same requirements for the proper destruction of personal data on Delaware businesses disposing records containing consumers’ personal identifying information.

Both of these statutes are aimed at addressing one of the more common ways in which a business may experience a data breach, namely the improper disposal of records.  Notably, both of these measures include broader definitions of personal identifying information than Delaware’s data breach notification statute, which only includes the following data elements:  Social Security number; driver’s license number or Delaware Identification Card number; or account number, or credit or debit card number, in combination with any required security code, access code, or password that would permit access to a resident’s financial account.

With these enactments, Delaware joins the list of 30 other states that in some way regulate the disposal of personal information.

In order to be a “protected computer” within the meaning of the federal Computer Fraud and Abuse Act (the “CFAA”), the computer must be used in interstate commerce at the time of the allegedly unauthorized access of the computer, the U.S. District Court for the District of Massachusetts held.  Pine Env. Servs., LLC v. Charlene Carson and Palms Env. and Survey, LLC, No. 1:14-cv-12830-IT (D. Mass. August 20, 2014).

Defendant Charlene Carson was employed by Plaintiff.  When she resigned her employment to join a competitor, she did not return her company-owned laptop.  Approximately two months after leaving her employment, Carson’s roommate observed her in their apartment working on the laptop.  The roommate left the room and when he returned found a note from Carson asking that he return the laptop to Plaintiff.

After Carson’s roommate returned the laptop, Plaintiff performed a forensic analysis of the laptop and learned that a software program called CCleaner was installed on the laptop and was used after Carson’s last day of employment with Plaintiff to destroy data and files, the internet browsing history, and event log entries on the laptop.  Plaintiff brought several state law claims against Carson and her new employer as well as a CFAA claim.  The CFAA protects computers that are used in or affect interstate commerce or communication from unauthorized use or access.

The CFAA provides a private right of action in certain situations where there is a loss of at least $5,000 when someone (1) knowingly and with intent to defraud, accesses a protected computer or exceeds authorized access and by such means furthers the intended fraud and obtains anything of value; or (2) knowingly causes the transmission of a program, information, code, or command, and as a result intentionally causes damage without authorization to a protected computer; or (3) intentionally accesses a protected computer without authorization and as a result causes damage and loss.  Plaintiff asserted the laptop was a protected computer because the company was engaged in providing rental equipment to other businesses throughout the country, Plaintiff’s principal place of business was in a different state from the one in which Carson lived and worked, and the laptop was used in interstate commerce and communication.

The court dismissed the CFAA claim because the laptop was only being used in interstate commerce when Carson was employed by Plaintiff.  Carson’s use of the laptop during her employment was authorized.  The unauthorized use of the laptop happened after the end of Carson’s employment with Plaintiff and thus occurred at a time when the laptop was not being used in interstate commerce.  The court found that the fact that the laptop formerly was used in interstate commerce did not make the later deletion of files from the laptop an action that was “interstate” in nature.

This decision highlights the importance of requiring employees to return all company-owned devices immediately upon their separation from employment.


When the U.S. Supreme Court decided United States v. Windsor, it declared section 3 of the Defense of Marriage Act (DOMA) to be unconstitutional. For many companies, the decision meant changes to certain of their employee benefit plans, as well as the tax treatment of employee contributions for same-sex spouses. However, declaring section 3 of DOMA unconstitutional reached well beyond ERISA-covered benefit plans, changing the application of many federal laws, including the HIPAA Privacy Rule. Today, the Office for Civil Rights (OCR) provided guidance concerning Windsor’s application to the HIPAA Privacy Rule.

Under the HIPAA Privacy Rule, covered entities can share information about a patient’s care with the patient’s family members in various circumstances – those family members include spouses. In addition, the Privacy Rule provides protections against the use of genetic information about an individual. Genetic information includes certain information about family members (including spouses) of the individual.

Based on the holding in Windsor, when the Privacy Rule uses the terms “spouse” and “marriage,” such as at 45 CFR 160.103 (definition of family member), lawful same-sex spouses have to be included. More specifically, the term “spouse” includes individuals who are in a legally valid same-sex marriage sanctioned by a state, territory, or foreign jurisdiction (as long as, as to marriages performed in a foreign jurisdiction, a U.S. jurisdiction would also recognize the marriage). The term “marriage” includes both same-sex and opposite-sex marriages, and family member includes dependents of those marriages. All of these terms apply to individuals who are legally married, whether or not they live or receive services in a jurisdiction that recognizes their marriage.

The OCR guidance clarifies, for example, that in connection with the standard concerning uses and disclosures to those involved in an individual’s care (45 CFR §164.510(b)), in cases where covered entities are permitted to share an individual’s protected health information with a family member of the individual, family member includes legally married same-sex spouses, regardless of where they live.

Covered entities and business associates should review their practices and alert their workforce members of this development. OCR intends to issue additional clarifications through guidance or rulemaking to address same-sex spouses as personal representatives under the Privacy Rule.


You may have been reading about how “Big Data” technologies are being used for a variety of purposes, such as making purchase suggestions based on prior buying patterns or staging law enforcement resources based on predictions for where and when crimes are likely to occur. But these technologies also are being used in the human resources context, such as to better select and manage applicants and employees, and can be of significant value to human resources leaders, and the company generally. Of course, there are mixed views about the use of this technology, as well as legal risks that should be considered.

Earlier this year, for example, a Forbes article explored the concern that if too heavy a weight is placed on “data” in the recruiting process, the human element can be lost and the business might not be capturing the top talent for the position. Others have observed that analytics tools in this context fall short in that they “don’t directly assess whether a person can do a job” and base recommendations on correlations that might not translate into good performance.

Certainly the role big data analytics tools can and should play in the workplace will depend on a range of factors, not the least of which is whether they can actually produce results. Employers that are considering whether these tools can positively impact HR decision making should also be considering the applicable risks when using this technology, even if “big data’s” recommendations are only one of many factors in the ultimate decision.

Attorneys at the EEOC, for example, are already considering the potential ways that using “big data” tools can violate existing employment laws, such as Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act, the Americans with Disabilities Act and the Genetic Information Nondiscrimination Act. Law360 recently reported (registration required) on comments made by EEOC Assistant Legal Counsel Carol Miaskoff, who discussed these potential risks and others during a workshop hosted by the Federal Trade Commission. There are, of course, a range of other potential issues including employee relations, labor relations, privacy and so on. At a minimum, employers need to proceed cautiously and be sure to maintain records that can verify their decisions were made lawfully.

I recently had the pleasure of speaking to a great group at the Connecticut Assisted Living Association (CALA) about HIPAA and a range of related practical issues. Many covered entities and business associates, particularly those that are small businesses, continue to work on understanding the privacy and security standards, and how to best apply them in their businesses and with their varied workforces. Compliance can be challenging, but it is important to get started and document the compliance steps taken. Here are some reminders about HIPAA privacy and security compliance:

  • Risk assessment. This is a critical step required under the security regulations. Many covered entities and business associates focus first on written policies and procedures to safeguard protected health information (PHI). But those policies and procedures need to address the risks and vulnerabilities to PHI, which can only be determined through an appropriate risk assessment. Of course, organizations need to continually assess their risks and vulnerabilities as their businesses change and grow.
  • Business Associate Agreements. The Health Information Technology for Economic and Clinical Health (HITECH) Act made a number of changes affecting “business associates.” Among those changes were updates to the “business associate agreements” that the HIPAA Rules require covered entities to maintain with their business associates, which could include claims administrators, consultants, and cloud and other data storage providers. The final HIPAA regulations established a transition rule that permitted covered entities and business associates to continue to operate under certain existing business associate agreements for up to one year beyond the compliance date of the final regulations (September 23, 2013). That transition period ends this month. Accordingly, it is critical that business associate agreements be updated. A starting point for business associate agreement compliance is the set of sample provisions posted by the Office for Civil Rights. However, there are other issues that parties to the business associate agreement will want to address, such as data breach coordination and response, indemnity, and agency status. Additionally, a number of state laws (e.g., California, Massachusetts, Maryland) require businesses to have contracts with third-party service providers to safeguard personal information, which likely will include information in addition to protected health information under HIPAA.
  • Data Breach Preparedness. Data breaches continue to happen across the country, exposing vast amounts of sensitive data. HIPAA regulations and laws in 47  states require a number of steps to be taken when a breach happens including notifying the affected individuals and certain governmental agencies. Absent a plan for responding, companies often find themselves ill-prepared to respond timely, correctly and completely. Responding timely is particularly important for avoiding an inquiry from a federal or state agency concerning a data breach. Having a plan and practicing that plan can significantly enhance a company’s ability to respond and minimize its exposure following a breach.
  • OCR Audits. It is expected that the Office for Civil Rights, which enforces the HIPAA privacy and security rules, will be resuming its audit program this fall – which applies to both covered entities and business associates. There are many steps covered entities and business associates can take to be audit ready. Good documentation is one of the most important. OCR wants to be able to see that the organization has taken steps to address the standards under the privacy and security rules. A documented risk assessment, written policies and procedures, and sign-off sheets showing workforce members went through HIPAA training are all examples of documentation an OCR investigator would be expecting to find as part of the audit.

Being “compliant” is no small task, especially as each business has its own particular needs, risks, vulnerabilities, environments, and circumstances that have to be considered. Compliance for an assisted living facility, for example, might look a bit different than it does for a large metropolitan hospital, but many of the fundamental principles are the same.  The key is to get started, understand the risks to PHI, address those risks in a manner appropriate to the organization (one hundred and fifty pages of policies and procedures is not appropriate for many organizations) and under each of the required standards, implement appropriate policies and procedures, and document.