Last month, Illinois Governor Bruce Rauner signed into law a number of amendments to the State’s Personal Information Protection Act (“PIPA”) that expand the definition of protected personal information and increase certain data breach notification requirements.  The amendments, highlighted below, take effect January 1, 2017.

Currently, “personal information” is limited to an individual’s first name or first initial and last name in combination with any of the following: the individual’s Social Security number; driver’s license number or state identification card number; or financial account number or credit or debit card number, alone or in combination with any required security code, access code, or password that would permit access to the individual’s financial account.

The amendments expand the definition of “personal information” to include medical information, health insurance information, and unique biometric data. Importantly, beginning in January, PIPA will require entities that suffer a security breach to notify Illinois residents of the breach even if the personal information was encrypted or redacted, if the passwords or keys needed to unencrypt or unredact that information were also acquired through the breach.

In addition, “personal information” will now include a user name or email address, in combination with a password or security question and answer that would permit access to an online account, when either the user name or email address, or the password or security question and answer, is not encrypted or redacted.

Under the new provisions, if notice is required and the breach of security involved an individual’s user name or email address, the notice must direct individuals to promptly change their user name or password and security question or answer, as applicable, or to take other steps appropriate to protect all online accounts for which the individual uses the same user name or email address and password or security question and answer.
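The amended notification logic described above can be sketched in code. This is a deliberately simplified illustration for discussion only, not legal advice; the field names and structure are assumptions, not drawn from the statute’s text.

```python
# Illustrative sketch of the amended PIPA notification rules discussed above.
# Field names ("data_encrypted_or_redacted", etc.) are hypothetical.

def notice_required(breach):
    """Return True if the incident described by `breach` triggers notice."""
    # Encrypted or redacted data alone generally would not trigger notice...
    if breach["data_encrypted_or_redacted"]:
        # ...unless the passwords/keys to unencrypt or unredact the data
        # were also acquired in the same breach.
        return breach["keys_or_passwords_acquired"]
    return True

def notice_content(breach):
    """Online-credential breaches carry an extra instruction for residents."""
    steps = ["describe the incident"]
    if breach.get("online_credentials_involved"):
        steps.append("direct individuals to promptly change credentials on "
                     "all online accounts sharing the same user name/password")
    return steps

incident = {
    "data_encrypted_or_redacted": True,
    "keys_or_passwords_acquired": True,
    "online_credentials_involved": True,
}
print(notice_required(incident))  # True: encryption does not excuse notice here
print(notice_content(incident))
```

The sketch captures the key shift in the amendments: encryption is no longer a safe harbor when the means to defeat it are taken in the same incident.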

An entity in possession of personal information will be required to implement and maintain reasonable security measures to protect the records from unauthorized access, destruction, or disclosure. Any entity in compliance with Section 501(b) of the Gramm-Leach-Bliley Act will be deemed in compliance with this provision.  Similarly, a HIPAA covered entity or business associate subject to HIPAA’s privacy and security standards will also be deemed in compliance with PIPA.  A covered entity or business associate that is required to notify the Secretary of Health and Human Services of a breach under the HITECH Act must also provide that notification to the Illinois Attorney General.

As states continue to expand their breach notification statutes, compliance will only become more difficult.

In the face of seemingly daily news reports of company data breaches, and mounting legislative efforts at both the state and federal level to enact laws safeguarding personal information maintained by companies, employers should be asking whether to implement privacy policies addressing the protection of the personal information they maintain on their employees.

To date, there is no all-encompassing federal privacy law. Rather, there are several federal laws that touch upon an aspect of protecting personal or private information collected from individuals, such as the Children’s Online Privacy Protection Act (giving parents control over the information collected from their children online); the Federal Trade Commission Act (pursuant to which the FTC has sought enforcement against companies that failed to follow their own privacy policies relating to consumers); the Gramm-Leach-Bliley Act (requiring financial institutions, such as banks, to protect consumer financial information); the Health Insurance Portability and Accountability Act of 1996 (requiring covered entities to protect individually identifiable health information); and the Americans with Disabilities Act and Family and Medical Leave Act (requiring confidentiality of employee medical information obtained by the employer).

State legislatures have likewise taken a piecemeal approach to the problem, with some states mandating the protection of Social Security numbers, protecting credit card information, protecting consumer financial information, and securing personally identifiable information (usually aimed at preventing identity theft). Additionally, forty-seven (47) states now have laws addressing notification and other requirements when a data breach occurs. While only a handful of states explicitly require a written privacy policy (such as Connecticut when collecting Social Security numbers and Massachusetts in connection with a written information security program), the overwhelming majority of states implicitly require privacy policies by requiring security of personal information (such as California, which now requires encryption) and notification when a breach of personal information has occurred. As such, where companies are required to notify affected individuals of a breach, they are implicitly required to protect the information to prevent such a breach. The first step in assembling that protective armor is to institute a privacy policy.

Employers maintain various types of personally identifiable information on their employees, including, but not limited to: names, dates of birth, Social Security numbers, addresses, telephone numbers, financial information (such as bank account numbers and credit/debit card numbers), email addresses and passwords, driver’s license, state-issued identification, and passport numbers, health insurance numbers, biometric data, personally identifiable information on an employee’s spouse and/or children (most commonly contained in benefit enrollment forms), and any other information maintained about an individual that could be used to identify him/her or obtain access to an online account.

Employer privacy policies should, at a minimum, address: (1) the types of personal information (such as that listed above), whether in electronic or paper format, obtained and maintained regarding employees and their family members; (2) where the information is maintained/stored; (3) how the information is protected both while being maintained and when being transferred from the employee to the employer, between the employer’s systems/departments, and outside of the employer’s organization (such as to a third-party vendor); (4) who has access to the information, including any outside vendors who perform personnel-related services for the employer; (5) the effective date of the policy; and (6) the individual within the organization responsible for compliance with the policy.
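The six minimum elements above lend themselves to a simple audit checklist. The sketch below is purely illustrative scaffolding, not a legal template; the wording of the checklist items merely paraphrases the list in the text.

```python
# Hypothetical checklist of the six minimum policy elements discussed above.
POLICY_CHECKLIST = [
    "types of personal information collected (electronic and paper)",
    "where the information is maintained/stored",
    "how the information is protected at rest and in transit",
    "who has access, including outside vendors",
    "effective date of the policy",
    "individual responsible for compliance",
]

def audit(policy):
    """policy: dict mapping checklist item -> bool (addressed in the draft?).
    Returns the items a draft policy has not yet addressed."""
    return [item for item in POLICY_CHECKLIST if not policy.get(item, False)]

# Example: a draft that covers everything except its effective date.
draft = {item: True for item in POLICY_CHECKLIST}
draft["effective date of the policy"] = False
print(audit(draft))  # ['effective date of the policy']
```

A checklist like this can help a drafting team confirm that no minimum element was dropped between revisions.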

Additionally, employers should consider training their employees on the policy. Employees who handle private information in the course of their employment should be trained on the contents of the policy; the importance of maintaining the privacy of the information; the methods used to protect such information; limiting disclosure of the information to what their duties require; and what to do when a suspected breach of the information has occurred. The general employee population should also be trained on the contents of the policy; the importance of maintaining the privacy of the information; and what to do if the employee suspects or has knowledge that the information has been breached.

Bloomberg BNA (subscription) recently reported that this fall the Center for Democracy & Technology (CDT) will be issuing a report on Fitbit Inc.’s privacy practices. Avid runners, walkers, or those up on the latest gadgets likely know about Fitbit and its line of wearable fitness devices. Others may know about Fitbit due to the need to measure progress in their employers’ wellness programs, or even whether they qualify for an incentive. When participating in those programs, employees frequently raise questions about the privacy and security of data collected under such programs, a compliance issue for employers. Earlier this month, Fitbit reported that its wellness platform is HIPAA compliant.

Fitbit’s Charge HR (the one I use) tracks some interesting data in addition to the number of steps: heart rate, calories burned, sleep activity, and caller ID. This and other data can be synced with a mobile app allowing users to, among other things, create a profile with more information about themselves, track progress daily and weekly, and find and communicate with friends also using a similar device.

Pretty cool stuff, and reasons why Fitbit is the most popular manufacturer of wearables, with nearly 25 percent of the market, as noted by Bloomberg BNA. But, of course, Fitbit is not the only player in the market, and the same issues have to be considered with the use of wearables regardless of the manufacturer.

According to Bloomberg BNA’s article, one of the concerns raised by CDT about Fitbit and other wearables is that the consumer data collected by the devices may not be protected by federal health privacy laws. However, CDT’s deputy director of the Consumer Privacy Project told Bloomberg BNA that she has “a real sense that privacy matters” to Fitbit. This is a good sign, but the laws that apply to the use of these kinds of devices depend on how they are used.

When it comes to employer-sponsored wellness programs and health plans, a range of laws may apply raising questions about what data can be collected, how it can be used and disclosed, and what security safeguards should be in place. At the federal level, the Health Insurance Portability and Accountability Act (HIPAA), the Americans with Disabilities Act (ADA), and the Genetic Information Nondiscrimination Act (GINA) should be on every employer’s list. State laws, such as California’s Confidentiality of Medical Information Act, also have to be taken into account when using these devices in an employment context.

Recently issued EEOC proposed regulations concerning wellness programs and the ADA address medical information confidentiality. If finalized in their current form, among other safeguards, the regulations would require employers to provide a notice informing employees about:

  • what medical information will be obtained,
  • who will receive the medical information,
  • how the medical information will be used,
  • the restrictions on its disclosure, and
  • the methods that will be used to prevent improper disclosure.

Preparing these notices for programs using wearables will require knowing more about the capabilities of the devices and how data is accessed, managed, disclosed and safeguarded.

But is all information collected from a wearable “medical information”? Probably not. The number of steps a person takes on a given day, in and of itself, seems unlikely to be medical information. However, data such as heart rate and other biometrics might be considered medical information subject to the confidentiality rule. Big data analytics and IoT may begin to play a greater role here, enabling more detailed pictures to be developed about employees and their activities and health through the many devices they use.

Increasingly, wellness programs seek to incentivize the household, or at least employees and their spouses. Collecting data from the wearables of both employee and spouse may raise issues under GINA, which prohibits employers from providing incentives to obtain genetic information from employees. Genetic information includes the manifestation of disease in family members (yes, spouses are considered family members under GINA). The EEOC is currently working on proposed regulations under GINA that we hope will provide helpful insight into this and other issues related to GINA.

HIPAA too may apply to wearables and their collection of health-related data when related to the operation of a group health plan. Employers will need to consider the implications of this popular set of privacy and security standards including whether (i) changes are needed in the plan’s Notice of Privacy Practices, (ii) business associate agreements are needed with certain vendors, and (iii) the plan’s risk assessment and policies and procedures adequately address the security of PHI in connection with these devices.

Working through plans for the design and implementation of a typical wellness program certainly must involve privacy and security; more so for programs that incorporate wearables. Fitbits and other devices likely raise employees’ interest and desire to get involved, and can ease administration of the program, such as tracking achievement of program goals. But they raise additional privacy and security issues in an area where the law continues to develop. So, employers need to consider this carefully with their vendors and counsel, and keep a watchful eye for more regulation likely to be coming.

Until then, I need to get a few more steps in…

When businesses set out to safeguard “personal information,” a fundamental consideration is what that term means. Likewise, when negotiating a third-party vendor agreement, it typically is not enough to rely on the standard definition for “confidential information.” Recently, Nevada and other states have updated their definitions of personal information in connection with data breach notification and safeguarding requirements. We cannot cover all of the updates here, but particularly for organizations operating in multiple states, it is important to ask the question and consider exactly what elements of personal information require protection. You may end up being more protective and including more data than necessary, and it may be practical to do so, but you will want to know what must be protected.

The Usual Suspects

In states that have enacted data breach notification laws or affirmative obligations to protect personal information, you can count on personal information including the usual suspects: Social Security number (SSN), driver’s license number or state identification number, and financial account numbers and payment card numbers with access codes. Why? In general, these are the data elements believed most likely to be used in the commission of identity theft. Note that a few states, like Nevada, make clear the law does not apply to the last four digits of some of these numbers, including the SSN.

But, of course, state laws are not the only source for law on the classes of personal information that warrant protection. Depending on the nature of your business, federal and international laws can also play a significant role in shaping the definition of personal information in your policy, as can contractual obligations.

Casting a Wider Net

One of the few states with an encryption mandate, Nevada recently expanded the scope of personal information subject to that mandate. Prior to the amendment, the state law (NRS 603A.040) defined personal information as noted above: Social Security number, driver’s license number or state identification number, and financial account numbers and payment card numbers with access codes. Massachusetts, which also has an encryption mandate, uses a similar definition. With the enactment of Assembly Bill No. 179, which becomes effective July 1, 2015 (though compliance is not required until July 1, 2016), “personal information” also includes:

  • driver authorization card number;
  • a medical identification number;
  • a health insurance identification number; and
  • a user name, unique identifier or electronic mail address in combination with a password, access code or security question and answer that would permit access to an online account.

A quick survey of some of the 47 state data breach notification laws reveals, in addition to the elements above, other elements of personal information that could trigger a notification requirement in certain states, such as:

  • biometric data, such as a fingerprint, retina or iris image;
  • date of birth;
  • maiden name;
  • an identification number assigned by an employer; and
  • digitized or other electronic signature.
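The survey above suggests a two-tier picture: a baseline of elements protected nearly everywhere, plus expanded elements that trigger notice only in certain states. The sketch below illustrates that structure; the element names are shorthand invented for the example, and the sets are drawn only from the elements mentioned in the text, not a compliance-grade survey of all 47 statutes.

```python
# Illustrative two-tier classification of personal-information elements,
# based only on the examples discussed in the text. Names are hypothetical.
BASELINE_ELEMENTS = {
    "ssn", "drivers_license", "state_id",
    "financial_account_with_access_code",
}
EXPANDED_ELEMENTS = {
    "driver_authorization_card", "medical_id", "health_insurance_id",
    "online_credentials", "biometric_data", "date_of_birth",
    "maiden_name", "employer_id_number", "electronic_signature",
}

def may_trigger_notice(elements_breached):
    """True if any breached element falls in either tier; elements in the
    expanded tier trigger notice only in certain states."""
    e = set(elements_breached)
    return bool(e & (BASELINE_ELEMENTS | EXPANDED_ELEMENTS))

print(may_trigger_notice({"date_of_birth"}))  # True (but only in some states)
print(may_trigger_notice({"favorite_color"}))  # False
```

For a multistate organization, the practical point is that a single flat definition of “personal information” understates the problem: the same breached element can be in scope in one state and out of scope in the next.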

As noted, classifications of personal information requiring protection are not solely a function of state law.

From a consumer protection standpoint, the Federal Trade Commission takes a broad view of personal information that needs to be secured and protected. In a decision concerning whether a company adequately safeguarded customer information, the FTC defined that term to include the following elements:

  • first and last name;
  • home or other physical address;
  • e-mail address or other online contact information, such as an instant messaging user identifier or a screen name;
  • telephone number;
  • Social Security number;
  • driver’s license or other state-issued identification number;
  • financial institution account number;
  • credit or debit card information;
  • persistent identifier, such as a customer number held in a “cookie,” a static Internet Protocol (“IP”) address, a mobile device ID, or processor serial number;
  • precise geolocation data of an individual or mobile device, including GPS-based, WiFi-based, or cell-based location information;
  • an authentication credential, such as a username and password; or,
  • any other communications or content that is input into, stored on, captured with, accessed, or transmitted through a covered device, including but not limited to contacts, e-mails, text messages, photos, videos, and audio recording.

For covered entities and business associates under HIPAA, “protected health information” encompasses health information, including demographic information, about an individual (and which does or can reasonably identify the individual) that relates to the (i) past, present, or future physical or mental health or condition of an individual, (ii) the provision of health care to an individual, or (iii) the past, present, or future payment for the provision of health care to an individual.

For employers, federal statutes like the Genetic Information Nondiscrimination Act (GINA) can be a trap for the unwary. It requires genetic information be safeguarded and not disclosed, except under certain circumstances. It may seem unusual, but one example of genetic information is information about the manifestation of disease in the spouse of an employee.

If you are charged with preparing your company to be compliant with safeguarding personal information, it is worth spending some time thinking about what personal information you need to protect. This requires knowing your business, where you do business, where your employees and customers reside, who you do business with, what your contractual obligations are, and a number of other factors. The answers may surprise you.

The saying – never let them see you sweat – may soon be more difficult to live by with Microsoft’s HoloLens. Like Google Glass, the HoloLens is worn as a headset. But this device has a “plurality” of sensors that gather a range of biometric parameters (heart rate, perspiration, etc.), which, along with other information, determine whether the wearer needs help with something; the device then tries to provide that help. Referred to in Microsoft’s patent application, approved earlier this year, as an “augmented reality help system,” the device’s applications and implications can be far reaching; it is not hard to see, for example, why companies might want to adopt this technology to benefit their business.

Consider a manufacturing or IT employee having trouble installing a new piece of equipment or assembling a piece of flat-pack furniture, a chore that drives some of my own biometric parameters. HoloLens may be able to help. The patent application states:

A person may experience stress that is related to a situation or current context. For example, a person may have difficulty performing a task and grow frustrated as the number of unsuccessful attempts at completing the task grows…

Experiencing stress may also inhibit clear thinking and increase the difficulty of successfully managing a task or situation. Additionally… seeking help from electronic devices would impose inconvenient burdens on the person, or may be impractical or even impossible given the person’s current context…

To address the above issues, an augmented reality help system [would] determine that the user is experiencing a stress response [and] present help content to the user via the head-mounted display device.

So, HoloLens can be a valuable tool for an individual trying to overcome complicated tasks at work, using various sensors to simultaneously collect and analyze a wide range of biometric and other data points to determine whether the individual needs help doing his or her job or a particular task. The device then provides information to the wearer through holographic images to help resolve the problem. These sensors include:

  • a heart rate monitor to measure heart rate,
  • a pulse oximeter sensor to measure hemoglobin saturation,
  • an electrodermal response sensor to monitor the skin’s electrical resistance,
  • an electroencephalographic (EEG) monitor to monitor brainwave activity, and
  • a perspiration sensor to detect sweat.

The descriptions of the device in the patent application and in news outlets and reports point to various applications and uses for HoloLens. A device like this might have substantial productivity benefits; one can envision lower training costs and fewer errors, among other advantages. However, like many new technologies, implementation would need to be handled carefully, not only to assess whether the device will work for the application intended, but also whether it will be worth the investment and effort given the legal and other risks. HoloLens adds to the long list of technologies and devices already on the market that legislatures and courts are grappling to understand and regulate.

Privacy and data security are among the many legal considerations and, of course, critical, as the device collects a range of health-related data that would seem able to paint a detailed, albeit incomplete, picture of an individual’s physical and/or mental health condition. Would an employee realize how much data is being collected and to whom that information is made available? Labor relations is another consideration, as employers would likely have to bargain with the union before requiring represented employees to use HoloLens for the purposes contemplated here. An employer also would have to consider, for example, whether gathering biometric and other medical data constitutes a disability-related inquiry under the Americans with Disabilities Act, and how the U.S. Equal Employment Opportunity Commission (EEOC) might view that activity. Whether the rules the EEOC proposed earlier this year concerning workplace wellness programs will address wearables, and perhaps shed light on the agency’s view of devices such as HoloLens, remains to be seen.

Once the information is collected, how will it be used? Managers oversee and monitor their employees regularly. A plant manager might observe assembly line operations for workers causing delays, or who need additional help, or who simply are not performing sufficiently. Devices like HoloLens would dramatically increase the information available to managers to assist in making these determinations. But will that information be the kind managers should be using? Will the use of the information increase the likelihood of disparate impact claims? These are just a few of the questions that need to be considered. Assuming such data can be collected and used for certain work-related purposes, companies already face challenges safeguarding personal information. Will they be able to maintain the security of the sensitive health data captured and transmitted by these devices?

HoloLens has not been released for sale yet, but there already is speculation about its release date, with some saying 2016. If true, it may not be long before someone at your company says, “Hey, we need this!” At that point, and maybe even before, businesses need to think carefully through the benefits and risks of introducing this or similar devices into the workplace, or allowing employees to use them.

Reacting to a report that identity theft was a top concern for Illinois residents (second in a list of ten), Attorney General Lisa Madigan announced a legislative proposal to strengthen the state’s existing data breach notification law. The call for stronger breach notification laws is a trend that has emerged in other states, such as New York and Indiana, and one that has had results. Florida and California are good examples. As summarized below, AG Madigan’s proposal follows a similar pattern – add provisions that require notification to the state Attorney General, expand the definition of personal information that would trigger a notification requirement, and require reasonable safeguards to protect personal information before a breach happens. It is this last point to which companies should pay particular attention. In a state Attorney General investigation following a breach, it will be those safeguards that are examined.

Attorney General Madigan has been active in the area of identity theft, maintaining an Identity Theft Unit and Hotline that provides one-on-one assistance to victims of identity theft and data breaches. She also has testified before the U.S. Senate and the U.S. House of Representatives in recent years concerning data breaches, including her testimony last month in connection with the federal data breach legislation being debated. She is now proposing significant changes to the law originally passed in 2005, the Personal Information Protection Act (PIPA). The changes include:

  • Expanding the types of personal information that could trigger a notification requirement to include medical information, biometric data, geolocation information, sensitive consumer marketing data, contact information when combined with identifying information, and login credentials for online accounts;
  • Requiring that the Attorney General’s office be notified in the event of a breach; and
  • Mandating that businesses take “reasonable” steps to protect the personal information covered by the law.

The substantial changes made to the Florida breach notification law last year also added a requirement for businesses to adopt and implement reasonable safeguards to protect personal information. Similar requirements exist in states such as Connecticut, California, Maryland, and Oregon. The most popular and most stringent of these state laws is the one in Massachusetts. Becoming effective almost 5 years ago to the day, March 1, 2010, the Massachusetts data security regulations flesh out one approach to providing reasonable safeguards. (Checklist available here).

Planning for a data breach is critical, but that should be part of an overall plan to safeguard personal information. If the trend of enhancements to data breach notification and safeguarding laws continues, it will not be long before most states have a statutory obligation to safeguard personal information through a set of written policies and procedures, just as 47 states today mandate notification in the event of a breach.

Over the past few months, many businesses, particularly in the Northeast Region, have been focusing on creating a written information security program (WISP) to comply with Massachusetts identity theft regulations that went into effect March 1, 2010. For many, this has been a significant effort, reaching most, if not all, parts of their organizations. However, it is important to remember that although Massachusetts may be the state with the most comprehensive set of rules for securing personal data, other states have enacted similar protections, and compliance with Massachusetts does NOT necessarily mean compliance with other states.

Consider the following examples:

California. The Civil Code in California states that a business that owns or licenses personal information about a California resident must:

implement and maintain reasonable security procedures and practices appropriate to the nature of the information, to protect the personal information from unauthorized access, destruction, use, modification, or disclosure.

For purposes of this requirement, “personal information” means:

an individual’s first name or first initial and his or her last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted:
(A) Social security number.
(B) Driver’s license number or California identification card number.
(C) Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.
(D) Medical information.

Similar protections for medical information exist in Arkansas, but that information is not covered by the rules in Massachusetts. Illinois requires safeguards for certain biometric information, a classification of data also not covered by the Massachusetts regulations.

Oregon. Oregon’s Consumer Identity Theft Protection Act lays out safeguards similar to those in Massachusetts, with some relief for small businesses (those manufacturing businesses with 200 employees or fewer and all other forms of business having 50 employees or fewer). Key is the requirement to implement an “information security program” that contains administrative, technical and physical safeguards.

Administrative safeguards include, for example: 

  1. designating one or more employees to coordinate the program;
  2. identifying reasonably foreseeable internal and external risks;
  3. assessing the sufficiency of data safeguards;
  4. training employees in the program’s practices and procedures;
  5. limiting outside service providers to those maintaining adequate data security safeguards; and
  6. adjusting the program according to business changes or new circumstances.

In New Jersey, regulations are pending that would create similar obligations.

Connecticut. Without specifying the kinds of safeguards, Connecticut requires any person in possession of personal information of another person to:

safeguard the data, computer files and documents containing the information from misuse by third parties, and [ ] destroy, erase or make unreadable such data, computer files and documents prior to disposal.

For purposes of this law, “personal information” includes:

information capable of being associated with a particular individual through one or more identifiers, including, but not limited to, a Social Security number, a driver’s license number, a state identification card number, an account number, a credit or debit card number, a passport number, an alien registration number or a health insurance identification number.

Similar requirements were enacted in other states, including Arkansas, North Carolina, Rhode Island, Texas, and Utah. But note the definition in Connecticut goes beyond the elements of data protected under the Massachusetts regulations.

Service contracts. Some states go a step further, requiring certain provisions be included in contracts between entities and their service providers when the contracts involve the disclosure of a state resident’s personal information from the owner of the information to the service provider. For example, such contracts in Nevada and Maryland must include a provision requiring the person to whom the information is disclosed to implement safeguards to protect that information.

The emergence of state mandates, fueled by the continued rapid advancement and increased use of technology, suggests a trend that is sure to become a fact of life for businesses operating anywhere in the U.S. Whether the technology is “cloud computing” or “peer-to-peer” software, businesses need to take appropriate steps to protect personal information maintained throughout their organizations.