Nebraska Amends Data Breach Notification Law

On April 13, 2016, Governor Pete Ricketts signed LB835 into law, amending Nebraska's data breach notification statute, Neb. Rev. Stat. §§ 87-802 – 87-804. The amendment makes a variety of changes, including adding a regulator notification requirement and broadening the definition of "personal information." These amendments become effective on July 20, 2016.

Specifically, the bill makes the following changes:

  • Attorney General Notification. The amendment requires notice to the State’s Attorney General concurrent with notice provided to affected individuals. These notices must be provided as soon as possible and without unreasonable delay consistent with law enforcement needs and the time necessary to determine the scope of the breach. This change follows a number of other states, such as California, Connecticut, Florida, Indiana, Maryland, Massachusetts, New Hampshire, New York, and North Carolina, which also require notification to the respective state’s Attorney General or other agency. Because the timing, form, content and manner of delivery of these notices vary state to state, organizations should take agency notifications into account when engaging in breach preparedness planning.
  • Personal Information Definition Expanded. The definition of "personal information" now includes a user name or email address in combination with a password or security question and answer that would permit access to an online account; unauthorized acquisition of these credentials will trigger the notification requirement. Recognizing the breadth of information consumers store online, Nebraska becomes the fifth state, joining California, Florida, Nevada and Wyoming, to require notification in the event of a breach of account credentials.
  • Encryption Exception Clarified. As amended, the state's breach notification law provides that data will not be considered encrypted, for purposes of avoiding notification, if the breach of security includes acquisition of the encryption key or the confidential method used to encrypt the data (see the sketch following this list).
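
The rationale behind this exception is straightforward: encryption protects data only so long as the key stays out of the attacker's hands. The short Python sketch below is a hypothetical illustration of that point, using the open-source cryptography package; the data and scenario are invented, and nothing here is drawn from the statute itself.

```python
# Illustrative sketch only: shows why encrypted data offers little protection
# once the key is also compromised. Requires the third-party "cryptography"
# package (pip install cryptography); the record below is invented.
from cryptography.fernet import Fernet

# The organization encrypts a record of personal information at rest.
key = Fernet.generate_key()            # symmetric encryption key
cipher = Fernet(key)
record = b"Jane Doe | DOB 01/01/1980 | acct 1234"
encrypted_record = cipher.encrypt(record)

# If a breach exposes only the ciphertext, the data remains unreadable.
# But if the same breach also captures the key, "encrypted" means little:
stolen_key, stolen_ciphertext = key, encrypted_record
print(Fernet(stolen_key).decrypt(stolen_ciphertext))  # b'Jane Doe | ...'
```

Keeping encryption keys stored and transmitted separately from the data they protect is one common way to preserve the benefit of an encryption safe harbor.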

The notice obligations triggered when an organization suffers a breach of the security of a system containing personal information continue to evolve. Preparedness is key, so take some time to develop a response plan, and practice it.

Employers Beware of Phishing Scams

On April 20, 2016, a class action lawsuit was filed in the United States District Court, Southern District of California against Sprouts Farmers Market, Inc. The lawsuit was initiated by a former employee whose W-2 was allegedly disclosed as part of a phishing scam that occurred in late March 2016 amid reports that Sprouts’ employees had their IRS tax refunds stolen. According to the complaint, the W-2s of Sprouts’ employees were disclosed to a third party as a result of the phishing scam.

This sort of internet scam, referred to as "phishing," occurs when someone attempts to acquire sensitive or confidential information under the guise of a legitimate request. For the average internet user, phishing often takes the form of a fake email from a bank or other financial institution asking the recipient to click a link and confirm a password on a web site that mimics the institution's legitimate site, often complete with its actual logos and branding.

In this case, the complaint alleges that an email purporting to come from a Sprouts executive was sent to an employee in the payroll department requesting the W-2s of all Sprouts workers. The employee responded by sending the W-2s of approximately 21,000 Sprouts employees. Sprouts later discovered that the original request was not legitimate and notified the authorities.
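
Technical controls are no substitute for employee vigilance, but simple heuristics can help surface suspicious requests like this one before anyone responds. The sketch below is a hypothetical illustration only; the domain, executive names, and message headers are assumptions made for the example, not details taken from the complaint.

```python
# Hypothetical illustration: flag inbound email that impersonates an executive.
# The company domain, executive roster, and headers below are invented.
from email.utils import parseaddr

COMPANY_DOMAIN = "example-grocer.com"            # assumed corporate domain
EXECUTIVE_NAMES = {"pat smith", "alex jones"}    # assumed executive roster

def looks_like_executive_spoof(headers: dict) -> bool:
    """Return True when the display name claims to be an executive but the
    sending or reply-to address falls outside the corporate domain."""
    display_name, from_addr = parseaddr(headers.get("From", ""))
    _, reply_addr = parseaddr(headers.get("Reply-To", "") or from_addr)
    claims_executive = display_name.strip().lower() in EXECUTIVE_NAMES
    external = (not from_addr.lower().endswith("@" + COMPANY_DOMAIN)
                or not reply_addr.lower().endswith("@" + COMPANY_DOMAIN))
    return claims_executive and external

# Example: a message that shows an executive's name but replies to an outside
# mailbox would be flagged for review before any W-2s are sent.
suspicious = {
    "From": "Pat Smith <pat.smith@example-grocer.com>",
    "Reply-To": "Pat Smith <payroll-request@freemail.example>",
}
print(looks_like_executive_spoof(suspicious))  # True
```

A check like this is only one layer; pairing it with a policy that requests for W-2 or other payroll data must be verified by phone is at least as important.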

The class action complaint alleges that Sprouts was negligent in its protection of private employee information, violated California Civil Code sections 1798.80 et seq. (including California's data breach law), and engaged in unfair business practices in violation of California Business and Professions Code section 17200. The complaint alleges that while Sprouts offered the impacted employees 12 months of credit monitoring services, the service chosen does not protect against identity theft and only notifies the consumer after identity theft or other fraudulent activity has occurred. The complaint also alleges that Sprouts had "lax" security procedures for its employee data, and concealed that fact from its employees.

This case highlights the need for employers to have protocols in place to protect employee information, and the risks of failing to do so.


EEOC Files Suit Targeting Employment Application “Health History”

On March 22, 2016, the Equal Employment Opportunity Commission ("EEOC") filed suit in the United States District Court for the Western District of Missouri against Grisham Farm Products, Inc., alleging that its employment application violated the Americans with Disabilities Act ("ADA") and the Genetic Information Nondiscrimination Act ("GINA"). Equal Employment Opportunity Commission v. Grisham Farm Products, Inc., No. 16-cv-03105. According to the EEOC's Complaint, Grisham

violated the ADA and GINA by requiring job applicants . . . to fill out a three-page ‘Health History’ before they would be considered for a job.

Plaintiff applied for a warehouse position at Grisham. The application contained 43 “yes or no” health-related questions. The questions were ones that might be seen when visiting a physician for the first time. For example, the EEOC’s Complaint alleges that the application inquired whether in the past 10 years, the applicant had (alphabetically) allergies, arthritis, bladder infections, eating disorders, gallstones, sexually transmitted diseases, etc. The form also inquired about prior hospitalizations, HIV infection, treatment for alcoholism, and whether the applicant “consulted a doctor, chiropractor, therapist, or other health care provider in the past 24 months.”

The application’s Health History section stated in large letters that:

All questions must be answered before we can process your application.

According to the EEOC, after answering the first question, the plaintiff stopped. The plaintiff had medical conditions and disabilities he would have revealed had he fully and completely answered each question. The EEOC claims the plaintiff believed he did not have to reveal his medical history to any potential employer. He therefore telephoned Grisham Farm, and a company representative told him that the company would not accept his application unless the health history was fully completed. According to the Complaint, the plaintiff refused to complete the health history.

In addition to requesting a permanent injunction barring Grisham Farm from making any pre-employment medical inquiries, the EEOC suit seeks monetary and punitive damages for the plaintiff.

In a statement issued in conjunction with the filing of the Complaint, the EEOC referred to the health form as being “among the most egregious we have seen.” This case should serve as a reminder to employers that pre-employment health inquiries can be made only after a conditional offer has been made, if the inquiries are made to all applicants for that job category, and provided the inquiries are job-related and consistent with business necessity.

Tennessee Amends Breach Notification Statute

On March 24, 2016, Tennessee’s breach notification statute was amended when Governor Bill Haslam signed into law S.B. 2005.

Under the amendment, notification of a data breach must now be provided to any affected Tennessee resident within 45 days after discovery of the breach (absent a delay request from law enforcement). Previously, and like the vast majority of states, Tennessee's statute required disclosure of a breach to be made in the most expedient time possible and without unreasonable delay. Like the Volunteer State, Florida previously amended its breach notification statute to require notification within a set time period.

Perhaps even more important than the specific timing requirement, S.B. 2005 also amends Tennessee's statute to remove the provision that required notice only in the event of a breach of unencrypted personal information. Accordingly, it appears Tennessee will be the first state in the country to require breach notification regardless of whether the information subject to the breach was encrypted.

Lastly, the bill also amends the statute to specify that an "unauthorized person" includes an employee of the information holder who is discovered to have obtained personal information and intentionally used it for an unlawful purpose. This amendment is likely aimed at entities that failed to provide notification of data incidents resulting from improper access by employees.

The law takes effect July 1, 2016.

FCC Chair Proposes New Broadband Rules

One year ago, in March 2015, the Federal Communications Commission (“FCC”) reclassified broadband Internet access service as a common carrier Telecommunications Service subject to regulation under Title II of the Communications Act.  At that time, however, the FCC recognized that the then-current rules were not well suited to broadband privacy.  On March 10, 2016, the FCC’s Chairman Tom Wheeler circulated for consideration by the full Commission a Notice of Proposed Rulemaking (“NPRM”) that effectively represents the start of the process of adopting rules suitable to broadband service.

The proposed rules would be built on three core principles: Customer choice, transparency, and data security.

Choice – Internet Service Providers (“ISPs”) would be required to provide customers with varying degrees of choice (i.e., no consent required, opt-out or opt-in), depending on how the customer’s personal information is used.

Transparency – ISPs would be required to disclose in "an easily understandable and accessible manner" the types of information they collect, how they use that information, and the circumstances in which they will share customer information with third parties.

Security – The proposal would require broadband providers to take reasonable steps to safeguard customer information from unauthorized use or disclosure. And, at a minimum, the proposal would require broadband providers to adopt risk management practices; institute personnel training practices; adopt strong customer authentication requirements; identify a senior manager responsible for data security; and take responsibility for use and protection of customer information when shared with third parties.

In order to encourage ISPs to protect the confidentiality of customer data, and to give consumers and law enforcement notice of failures to protect such information, the Chairman’s proposal includes common-sense data breach notification requirements. Specifically, in the event of a breach, providers would be required to notify:

  • Affected customers of breaches of their data no later than 10 days after discovery.
  • The Commission of any breach of customer data no later than 7 days after discovery.
  • The Federal Bureau of Investigation and the U.S. Secret Service of breaches affecting more than 5,000 customers no later than 7 days after discovery of the breach.
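
For breach-response planning purposes, the proposed deadlines reduce to a small decision rule. The sketch below encodes them as a hypothetical illustration only; it treats the periods as calendar days from discovery and is not a substitute for the proposal's (or any final rule's) actual text.

```python
# Hypothetical illustration of the proposed FCC breach notification deadlines.
# Periods are computed as calendar days from the discovery date; a final rule
# could measure time differently.
from datetime import date, timedelta

def fcc_notification_deadlines(discovered: date, affected_customers: int) -> dict:
    deadlines = {
        "affected customers": discovered + timedelta(days=10),
        "FCC": discovered + timedelta(days=7),
    }
    if affected_customers > 5000:
        deadlines["FBI and U.S. Secret Service"] = discovered + timedelta(days=7)
    return deadlines

# Example: a breach affecting 12,000 customers discovered on June 1, 2016.
for recipient, due in fcc_notification_deadlines(date(2016, 6, 1), 12000).items():
    print(f"Notify {recipient} by {due.isoformat()}")
```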

The proposed rule would apply exclusively to providers of broadband Internet access service and not to providers such as Amazon and Facebook or other operators of social media websites.

The proposal will be voted on by the full Commission on March 31, and, if adopted, would be followed by a period of public comment.

Check Your Spam Filter, You Might Have Been Selected for a HIPAA Audit!

Yesterday, the federal Office for Civil Rights (OCR) announced Phase 2 of its HIPAA Audit Program (Program). In its announcement, the OCR reports that the Program is underway and provides some helpful FAQs for covered entities and business associates about the Program. Preparation is critical and there are some key points covered entities and business associates should focus on.

Every covered entity and business associate is eligible for an audit. So, don't think that because you are a small health care provider or sponsor a group health plan for employees you will be out of the Program's reach. Auditee selection will be based on a number of criteria, including size of the entity, affiliation with other healthcare organizations, the type of entity and its relationship to individuals, whether an organization is public or private, geographic factors, and present enforcement activity with OCR. The OCR appears to be looking to examine a healthy cross-section of covered entities and business associates. On the bright side, OCR stated it will not commence an audit under the Program where there is an open complaint investigation or a current compliance review.

Potential auditees will be screened. OCR may send a questionnaire to covered entities asking them to identify their business associates and provide their contact information. OCR warns that if it does not receive responses to these requests, it will use publicly available information to create its audit pool, and nonresponsive entities still may be selected for an audit or subject to a compliance review. In fact, OCR informs covered entities and business associates that it expects them to check their junk or spam email folders for OCR communications about the Program.

…we expect you to check your junk or spam email folder for emails from OCR

The Program will include Desk Audits, followed by On-site Audits. The first stage of the Program will involve desk audits for covered entities, followed by desk audits for business associates, all of which will be completed by year end. After that, audits will be onsite and will examine a broader scope of requirements from the HIPAA Rules than desk audits. Some desk auditees may be subject to a subsequent onsite audit. The audits will examine compliance with specific requirements of the HIPAA Privacy, Security, or Breach Notification Rules. So, for example, OCR might want to look at your documented risk assessment, or your breach notification response plan. Auditees will be notified of the subject(s) of their audit in a document request letter, but OCR confirmed the audits will not cover compliance with state privacy laws.

Consider the audit process and timeline. Covered entities and business associates selected for a desk audit should expect to receive an email informing them of the selection and requesting documents and other data. Auditees will be able to submit documents on-line via a secure audit portal on OCR’s website. OCR expects that the documents and data will be provided within 10 business days of the request.

After submitting the documents and data, auditees will receive draft findings from OCR. Auditees will then have 10 business days to review and return written comments to the auditor. Auditees should expect to receive a final audit report within 30 business days.
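
Because these review windows run in business days, it is worth calendaring them precisely when a request arrives. The sketch below is a simple, hypothetical illustration of computing a business-day deadline; it skips weekends only and ignores federal holidays, which is an assumption of the sketch rather than anything OCR has specified.

```python
# Simple illustration: compute a deadline N business days after a start date.
# Skips weekends only; federal holidays are ignored (an assumption of this
# sketch, not an OCR rule).
from datetime import date, timedelta

def business_days_after(start: date, business_days: int) -> date:
    current = start
    remaining = business_days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:          # Monday=0 ... Friday=4
            remaining -= 1
    return current

# Example: draft findings received Friday, July 15, 2016 -> written comments
# due 10 business days later.
print(business_days_after(date(2016, 7, 15), 10))  # 2016-07-29
```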

Onsite audits will follow a similar process. The auditors will schedule an entrance conference to discuss the audit, which can be expected to take place onsite over three to five days, depending on the size of the entity. Onsite audits will be more comprehensive and cover a wider range of requirements from the HIPAA Rules. Like desk auditees, onsite auditees will have 10 business days to review the draft findings and provide written comments, and they will be provided a final audit report.

Don’t want to respond? Entities that do not respond to OCR communications still may be selected for audit or be subject to a compliance review. As noted, the agency will use public means to find you.

We’ve been audited, now what? OCR states that the Program is primarily a “compliance improvement” activity, through which it can better understand compliance efforts, and determine what types of technical assistance should be developed and what types of corrective action would be most helpful. Of course, if OCR finds a serious compliance issue, it may initiate further investigation.

There may be publicity surrounding audits. OCR states that it will not post a list of audited entities or the findings of an individual audit identifying the audited entity. However, OCR reports that it will comply with Freedom of Information Act (FOIA) requests, which could make the results of your audit public.

For now, covered entities and business associates should be on the look-out for communications from OCR and be prepared to respond. It goes without saying that they also should use this as an opportunity to assess their compliance and take steps now to address any gaps.

Should We Train Our Employees About Good Data Privacy and Security Practices?

Yes! It is the law in more places and circumstances than you suspect.

Late last year, The Wall Street Journal reported on a survey by the Association of Corporate Counsel (“ACC”) that found “employee error” is the most common reason for a data breach. CSOOnline reported on Experian’s 2015 Second Annual Data Breach Industry Forecast, stating:

“Employees and negligence are the leading cause of security incidents but remain the least reported issue.”

According to Kroll, in 31% of the data breach cases it reviewed in 2014, the cause of the breach was a simple, non-malicious mistake. These incidents were not limited to electronic data – about one in four involved paper or other non-electronic data.

No business wants to send letters to individuals – employees or customers – informing them about a data breach. Businesses also do not want to have their proprietary and confidential business information, or that of their clients or customers, compromised. Unfortunately, no "silver bullet" exists to prevent important data from being accessed, used, disclosed or otherwise handled inappropriately – not even encryption. Companies must simply manage this risk through reasonable and appropriate safeguards. Because employees are a significant source of risk, steps must be taken to manage that risk, and one of those steps is training.

It is a mistake to believe that only businesses in certain industries like healthcare, financial services, retail, education and other heavily regulated sectors have obligations to train employees about data security. A growing body of law coupled with the vast amounts of data most businesses maintain should prompt all businesses to assess their data privacy and security risks, and implement appropriate awareness and training programs.

Data privacy and security training can take many forms. Here are some questions to ask when setting up your own program, which are briefly discussed in the report at the link above:

  • Who should design and implement the program?
  • Who should be trained?
  • Who should conduct the training?
  • What should the training cover?
  • How often should training be provided?
  • How should training be delivered?
  • Do we need to document the training?

No system is perfect, however, and even a good training program will not prevent data incidents from occurring. But the question you will have to answer for the business is not why the company didn't have a system in place to prevent all inappropriate uses or disclosures. Instead, the question will be whether the business had safeguards that were compliant and reasonable under the circumstances.

The Inexplicit Requirement and Definitive Necessity for Employers to Implement Privacy Policies

In the face of seemingly daily news reports of company data breaches, and mounting legislative efforts at both the state and federal level to enact laws safeguarding personal information maintained by companies, employers should be asking whether to implement privacy policies addressing the protection of the personal information they maintain on their employees.

To date, there is no all-encompassing federal privacy law. Rather, there are several federal laws that touch upon an aspect of protecting personal or private information collected from individuals, such as the Children's Online Privacy Protection Act (giving parents control over the information collected from their children online); the Federal Trade Commission Act (pursuant to which the FTC has sought enforcement against companies that failed to follow their own privacy policies relating to consumers); the Gramm-Leach-Bliley Act (requiring financial institutions, such as banks, to protect consumer financial information); the Health Insurance Portability and Accountability Act of 1996 (requiring covered entities to protect individually identifiable health information); and the Americans with Disabilities Act and Family and Medical Leave Act (requiring confidentiality of employee medical information obtained by employers).

State legislatures have likewise taken a piecemeal approach to the problem, with some states mandating the protection of social security numbers, credit card information, and consumer financial information, and the securing of personally identifiable information (usually aimed at preventing identity theft). Additionally, forty-seven (47) states now have laws imposing notification and other requirements when a data breach occurs. While only a handful of states explicitly require a written privacy policy (such as Connecticut when collecting social security numbers and Massachusetts in connection with a written information security program), the overwhelming majority of states implicitly require privacy policies by requiring security of personal information (such as California, which now requires encryption) and notification when a breach of personal information has occurred. Where companies are required to notify affected individuals of a breach, they are implicitly required to protect the information to prevent such a breach in the first place. The first step in assembling that protective armor is to institute a privacy policy.

Employers maintain various types of personally identifiable information on their employees, including, but not limited to: names, dates of birth, social security numbers, addresses, telephone numbers, financial information (such as bank account numbers and credit/debit card numbers), email addresses and passwords, driver's license, state-issued identification and passport numbers, health insurance numbers, biometric data, personally identifiable information on an employee's spouse and/or children (most commonly contained in benefit enrollment forms), and any other information maintained about an individual that could be used to identify him or her or obtain access to an online account.

Employer privacy policies should, at a minimum, address: (1) the types of personal information (such as that listed above), whether in electronic or paper format, obtained and maintained regarding employees and their family members; (2) where the information is maintained and stored; (3) how the information is protected, both while being maintained and when being transferred from the employee to the employer, between the employer's systems and departments, and outside of the employer's organization (such as to a third-party vendor); (4) who has access to the information, including any outside vendors who perform personnel-related services for the employer; (5) the effective date of the policy; and (6) the individual within the organization responsible for compliance with the policy.

Additionally, employers should consider training their employees on the policy. Employees who handle private information in the course of their employment should be trained on the contents of the policy; the importance of maintaining the privacy of the information; the methods to be used to protect such information; limiting use and disclosure of the information to what their duties require; and what to do when a suspected breach of the information has occurred. The general employee population should also be trained on the contents of the policy; the importance of maintaining the privacy of the information; and what to do if an employee suspects or has knowledge that the information has been breached.

Internet of Things Bill Introduced

Recognizing the growing number of connected and interconnected devices, a bipartisan group of Senators recently introduced a bill which would convene a working group of Federal stakeholders to provide recommendations to Congress on how to appropriately plan for and encourage the proliferation of the Internet of Things (IoT).

The Developing Innovation and Growing the Internet of Things Act (DIGIT Act) would require the working group to examine the IoT's current and future spectrum needs; the regulatory environment (including identification of sector-specific regulations, Federal grant practices, and budgetary or jurisdictional challenges); consumer protection; privacy and security; and Federal agencies' current use of the technology and their preparedness to adopt it in the future.

While the working group would seek representatives from the Department of Transportation (DOT), the Federal Communications Commission (FCC), the Federal Trade Commission (FTC), the National Science Foundation, the Department of Commerce, and the Office of Science and Technology, it would also be required to consult with non-governmental stakeholders, including: (i) subject matter experts; (ii) information and communications technology manufacturers, suppliers, and vendors; (iii) small, medium, and large businesses; and (iv) consumer groups. The findings and recommendations of the working group would be submitted to the appropriate committees of Congress within one year of the bill's enactment.

Additionally, the DIGIT Act would also direct the FCC, in consultation with the National Telecommunications and Information Administration, to conduct its own study to assess the current and future spectrum needs of the IoT. The FCC would similarly have one year after the enactment of the Act to submit a report including recommendations as to whether there is adequate licensed and unlicensed spectrum availability to support the growing IoT, what regulatory barriers may exist, and what the role of licensed and unlicensed spectrum is in the growth of the IoT.

According to the bill, estimates indicate more than 50,000,000,000 devices will be connected by the year 2020, with the IoT having the potential to generate trillions of dollars in economic opportunity. The IoT will also allow businesses across the country to simplify logistics, cut costs, and pass savings on to consumers. Citing the belief that the United States leads the world in developing technologies that support the IoT and that this technology may be implemented by the U.S. Government to better deliver services to the public, the bill's sponsors introduced the DIGIT Act following a previous Senate Resolution (Senate Resolution 110, 114th Congress) calling for a national strategy for the development of the IoT.

Dwolla Fined $100,000 by CFPB in First Data Security Enforcement Action

The Consumer Financial Protection Bureau ("CFPB") gave the fintech online payment sector a "wake-up call" with an enforcement action against a Des Moines-based digital payment start-up, Dwolla, Inc. ("Dwolla").

The CFPB alleged that Dwolla misrepresented how it was protecting consumers' data. Dwolla entered into a Consent Order to settle the CFPB charges and agreed to pay a $100,000 penalty and to change and improve its current security practices. The CFPB never alleged that any Dwolla consumer data had actually been breached. According to the CFPB, Dwolla "failed to employ reasonable and appropriate measures to protect data obtained from consumers from unauthorized access," while telling consumers that the information was "securely encrypted and stored." Dwolla had over 650,000 customer accounts and was transferring as much as $5 million a day in 2015.

In a nutshell, the CFPB alleged that Dwolla's representations about "securely encrypted and stored" data were inaccurate for a number of specific reasons, including:

  • Failing to implement appropriate data security policies and procedures until at least September 2012,
  • Failing to implement a written data security plan until at least October 2013,
  • Failing to conduct adequate risk assessments,
  • Failing to use encryption technology to properly safeguard consumer information,
  • Failing to provide adequate or mandatory employee training on data security, and
  • Failing to practice secure software development for consumer-facing applications

In addition to the fine, Dwolla agreed to take preventative steps to address security concerns including:

  • Implementing a comprehensive data security plan,
  • Conducting data security risk assessments twice annually,
  • Designating a qualified individual to be accountable for data security issues,
  • Implementing appropriate data security policies and procedures,
  • Implementing an appropriate and precise method of customer identity authentication before any funds transfer,
  • Adopting specific procedures for the selection and retention of service providers capable of maintaining security practices,
  • Conducting regular and mandatory data security training, and
  • Obtaining an annual data security audit from an independent, third party acceptable to CFPB’s enforcement director.

The Consent Order will remain in effect for five (5) years.

This is the CFPB's first enforcement action directly related to data security and appears to expand the CFPB's jurisdiction into this arena. In the CFPB's press release, Director Richard Cordray stated, "With data breaches becoming commonplace and more consumers using these online payment systems, the risk to consumers is growing. It is crucial that companies put systems in place to protect this information and accurately inform consumers about their data security practices."

This inaugural enforcement action by the CFPB appears to be a direct response to growing concern about the lack of regulation of fintech digital payment firms. The enforcement action is also a welcome signal to traditional banks, which have argued that the fintech sector has not received nearly the level of oversight or enforcement that they have. It appears regulators are attempting to find the right balance between acting too "heavy handed," which could squelch the technical advances that have made finance more convenient for consumers, and ensuring an adequate level of consumer protection.
