Bloomberg BNA (subscription) recently reported that this fall the Center for Democracy & Technology (CDT) will be issuing a report on Fitbit Inc.’s privacy practices. Avid runners, walkers, and those up on the latest gadgets likely know about Fitbit and its line of wearable fitness devices. Others may know about Fitbit because they need to measure progress in their employers’ wellness programs, or even to determine whether they qualify for an incentive. When participating in those programs, employees frequently raise questions about the privacy and security of the data collected, a compliance issue for employers. Earlier this month, Fitbit reported that its wellness platform is HIPAA compliant.

Fitbit’s Charge HR (the one I use) tracks some interesting data in addition to the number of steps: heart rate, calories burned, sleep activity, and caller ID. This and other data can be synced with a mobile app, allowing users to, among other things, create a profile with more information about themselves, track progress daily and weekly, and find and communicate with friends using similar devices.

Pretty cool stuff, and reasons why Fitbit is the most popular manufacturer of wearables, with nearly 25 percent of the market, as noted by Bloomberg BNA. But, of course, Fitbit is not the only player in the market, and the same issues have to be considered with the use of wearables regardless of the manufacturer.

According to Bloomberg BNA’s article, one of the concerns raised by CDT about Fitbit and other wearables is that the consumer data collected by the devices may not be protected by federal health privacy laws. However, CDT’s deputy director of the Consumer Privacy Project told Bloomberg BNA that she has “a real sense that privacy matters” to Fitbit. This is a good sign, but the laws that apply to these kinds of devices depend on how they are used.

When it comes to employer-sponsored wellness programs and health plans, a range of laws may apply raising questions about what data can be collected, how it can be used and disclosed, and what security safeguards should be in place. At the federal level, the Health Insurance Portability and Accountability Act (HIPAA), the Americans with Disabilities Act (ADA), and the Genetic Information Nondiscrimination Act (GINA) should be on every employer’s list. State laws, such as California’s Confidentiality of Medical Information Act, also have to be taken into account when using these devices in an employment context.

Recently issued EEOC proposed regulations concerning wellness programs and the ADA address medical information confidentiality. If finalized in their current form, the regulations would, among other safeguards, require employers to provide a notice informing employees about:

  • what medical information will be obtained,
  • who will receive the medical information,
  • how the medical information will be used,
  • the restrictions on its disclosure, and
  • the methods that will be used to prevent improper disclosure.

Preparing these notices for programs using wearables will require knowing more about the capabilities of the devices and how data is accessed, managed, disclosed and safeguarded.

But is all information collected from a wearable “medical information”? Probably not. The number of steps a person takes on a given day, in and of itself, seems unlikely to be medical information. However, data such as heart rate and other biometrics might be considered medical information subject to the confidentiality rule. Big data analytics and IoT may begin to play a greater role here, enabling more detailed pictures to be developed about employees and their activities and health through the many devices they use.

Increasingly, wellness programs seek to incentivize the household, or at least employees and their spouses. Collecting data from the wearables of both employee and spouse may raise issues under GINA, which prohibits employers from providing incentives to obtain genetic information from employees. Genetic information includes the manifestation of disease in family members (yes, spouses are considered family members under GINA). The EEOC is currently working on proposed regulations under GINA that we hope will provide helpful insight into this and other GINA-related issues.

HIPAA too may apply to wearables and their collection of health-related data when related to the operation of a group health plan. Employers will need to consider the implications of this popular set of privacy and security standards including whether (i) changes are needed in the plan’s Notice of Privacy Practices, (ii) business associate agreements are needed with certain vendors, and (iii) the plan’s risk assessment and policies and procedures adequately address the security of PHI in connection with these devices.

Working through plans for the design and implementation of a typical wellness program certainly must involve privacy and security; more so for programs that incorporate wearables. Fitbits and other devices likely raise employees’ interest and desire to get involved, and can ease administration of the program, such as tracking achievement of program goals. But they raise additional privacy and security issues in an area where the law continues to develop. So, employers need to consider this carefully with their vendors and counsel, and keep a watchful eye for more regulation likely to come.

Until then, I need to get a few more steps in…

According to a Bloomberg article, the second phase of HIPAA audits by the Office for Civil Rights (OCR), originally set to commence in 2014, may be coming soon. This update came at a HIPAA conference co-hosted by OCR, during which OCR Director Jocelyn Samuels said the agency was in the process of confirming contact information for the entities that would be audited. The reason for the delay: budgetary limitations and gaps in personnel.

Covered entities and business associates have been hearing about a second phase of HIPAA audits and a permanent OCR audit program since the OCR pilot program back in 2011 and 2012. But inaction by the agency should not delay an organization’s preparedness. Perhaps more likely than an OCR audit, a covered entity or business associate may experience a data breach affecting protected health information (PHI). Most recently, Excellus Healthcare experienced a breach affecting 10.5 million individuals. In the case of a breach, a resulting OCR investigation/compliance review and findings of inadequate compliance with the privacy or security rules could result in far more dire consequences to the organization than what might follow an audit.

Reports about the upcoming audit program indicate some key areas of focus by the OCR. These also are areas that OCR has raised numerous times in settlements with covered entities following data breach investigations.

  • Has a risk assessment been carried out and documented?
  • Are written policies and procedures in place that address the privacy and security standards, and vulnerabilities identified in the assessment?
    • Strong “practices” are not enough – they need to be in writing.
  • Is an incident response plan in place for responding to breaches of unsecured PHI?
  • Are adequate safeguards in place for mobile devices and storage media?
    • Your doctors, nurses and staff have their own devices – do you have a BYOD policy that incorporates HIPAA issues, not just data security?
  • Is a training program in place, with documented training for new workforce members and periodically for all workforce members?
  • Is a compliant Notice of Privacy Practices provided to patients?
    • Have you checked your website lately? Many covered healthcare providers only provide hardcopies of these Notices in the office without realizing that they may need to have the Notice prominently available on the practice’s website.
  • Do you have appropriate agreements in place with business associates?

It is anticipated that most of the audits will be “desk audits.” This means that an OCR investigator will not be coming to visit you in person, but will be asking for documents. The investigator will want to see that the assessment has taken place, that the policies have been adopted, that the training has been conducted, that the notices have been delivered, etc. Operational compliance (that is, are you doing what compliant policies say you should be doing) may not always be 100%, but having the right documents in place can go a long way toward helping you survive an OCR audit, whether in connection with the long-awaited second phase of audits, or following a data breach.

Government contractors have a wide range of unique challenges (find out more about these here), not the least of which is data security. A good example is the interim rule the Department of Defense (DoD) issued last month that implements sections of the National Defense Authorization Act for Fiscal Years 2013 and 2015. In short, these provisions expand the incident reporting requirements for contractors and increase the security requirements for cloud service providers.

The Secretary of Defense determined that “urgent and compelling” reasons exist to issue the interim rule without prior opportunity for public comment. There is an urgent need to protect covered defense information and to increase awareness of the full scope of cyber incidents being committed against defense contractors. The use of cloud computing has greatly increased, according to the Secretary, and has increased the vulnerability of DoD information. The recent high-profile breaches of Federal information also influenced this determination. It is easy to see how similar considerations will influence other federal and state agencies to tighten their data security requirements on their contractors and subcontractors.

The hope here is that the rule will increase the cyber security on DoD information on contractor systems, help to mitigate risk, and gather information for the development of future improvements in cyber security. Note that the DoD will consider public comments to the interim rule before issuing the final rule. Comments must be submitted on or before October 26, 2015 to be considered.

Incident Reporting Highlights

  • Contractors and subcontractors must report cyber incidents that result in an actual or potentially adverse effect on a covered contractor information system or covered defense information residing on that system, or on a contractor’s ability to provide operationally critical support.
  • A “cyber incident” means actions taken through the use of computer networks that result in a compromise or an actual or potentially adverse effect on an information system and/or the information residing therein. A “compromise” is the disclosure of information to unauthorized persons, or a violation of the security policy of a system, in which unauthorized intentional or unintentional disclosure, modification, destruction, or loss of an object, or the copying of information to unauthorized media may have occurred.
  • Rapid reporting is required – this means within 72 hours of discovery of a cyber incident.
  • The DoD recognizes that the reporting may include the contractor’s proprietary information, and will protect against the unauthorized use or release of that information.
  • The reporting of a cyber incident will not, by itself, be interpreted as evidence that the contractor or subcontractor has failed to adequately safeguard covered defense information.

Cloud Computing Highlights

  • Contracts for cloud computing services may be awarded only to providers that have been granted provisional authorization by the Defense Information Systems Agency, at the appropriate level.
  • Cloud computing service providers must maintain government data within the 50 states, the District of Columbia, or outlying areas of the United States, unless physically located on DoD premises. Government data can be maintained outside the U.S. upon written notification from the contracting officer.
  • Government data means any information, document, media, or machine readable material regardless of physical form or characteristics, that is created or obtained by the government in the course of official government business.
  • Purchase requests for cloud computing service must, among other things, describe government data and the requirement for the contractor to coordinate with the responsible government official to respond to any “spillage” occurring in connection with the services. Spillage happens when a security incident results in the transfer of classified or controlled unclassified information onto an information system not authorized for the appropriate security level.

Defense contractors and their subcontractors will need to review the interim rule carefully and make adjustments. Of course, the focus here is not solely on personally identifiable information, but the same principles apply. Maintaining a well-thought-out and practiced incident response plan is critical.

On September 2, the Office for Civil Rights (OCR) reported that it agreed to settle potential violations of the HIPAA privacy and security regulations with Cancer Care Group, Inc. The dollar amount of the settlement, $750,000, is significant, and the agreement to adopt a robust, multi-year corrective action plan under the watchful eye of the government is likely to be an unwanted strain on the business.

With 17 radiation oncologists, Cancer Care is by no means a mom and pop outfit, but it is also not a national provider. Small to mid-sized healthcare providers and their business associates need to take note. What started as a seemingly small theft issue – laptop bag stolen from an employee’s car – has led to nearly a million dollars in settlement and other costs, and years of government monitoring of the practice’s privacy and security compliance.

Thinking your healthcare or related business will not experience a breach may not be a wise approach. According to a KPMG report highlighted by ConsumerAffairs, 81 percent of hospitals and health insurance companies have had a data breach in the past two years. The real question, however, is whether your business will be able to stand up to the OCR compliance review that comes along with the OCR’s investigation of the breach.

What happened: On August 29, 2012, OCR received notification from Cancer Care regarding a breach of unsecured electronic protected health information (ePHI) after a laptop bag was stolen from an employee’s car. The bag contained the employee’s computer and unencrypted backup media, which contained the names, addresses, dates of birth, Social Security numbers, insurance information and clinical information of approximately 55,000 current and former Cancer Care patients. This is a fairly typical scenario for many businesses, including health care providers, given the myriad devices employees use every day in their jobs.

The OCR investigation: OCR claims Cancer Care was in “widespread non-compliance with the HIPAA Security Rule.” According to OCR’s press release, the provider “had not conducted an enterprise-wide risk analysis…did not have in place a written policy specific to the removal of hardware and electronic media containing ePHI into and out of its facilities, even though this was common practice within the organization.” So you see, it was not so much the theft of the laptop, but the alleged lack of safeguards and compliance that could have (even if it in fact would not have) prevented the breach from happening, that drew the agency’s ire.

OCR’s Corrective Action Plan (CAP): You can read the CAP here. Under the CAP, you’ll find that Cancer Care needs to get OCR’s approval before it can proceed with key compliance steps. For example, it needs to provide its risk assessment to OCR within 90 days of the effective date of the settlement agreement, and await OCR’s approval. A similar process applies for other components of the HIPAA security rules, including the development of a risk management plan and training program. Cancer Care must also provide an annual report to OCR for at least three years concerning updates or changes to its risk management plan, among a number of other things.

Take Away: No health care provider or other business wants to have a breach. But if it does, it will be less likely to face significant enforcement action by OCR if it has a compliance program in place – perform and document a risk assessment; address the risks of mobile devices; design and implement a quality training program. These are just a few of the steps a health care provider, health plan, business associate or other organization with HIPAA privacy and security obligations should be taking to mitigate these compliance risks.

In a recent ruling, the Seventh Circuit abandoned its previous stance as to whether a complete offer of judgment prior to the filing of a class certification motion would moot a class action brought pursuant to the Telephone Consumer Protection Act (TCPA).

In 2009, the plaintiff, Arnold Chapman, brought a class action alleging First Index Inc. had violated the TCPA when it sent “junk faxes” without the consent of the recipients.  While Chapman’s class certification motion was pending, First Index made an offer of judgment under FRCP 68 for Chapman’s entire damages.  Thereafter, Chapman did not respond.  Following Chapman’s failure to respond, and on application from First Index, the district court dismissed Chapman’s claims as moot.

The Seventh Circuit reversed the district court’s decision, holding that Chapman’s case would be moot only if it were impossible for a court to grant any effectual relief whatsoever to the prevailing party.  Under this analysis, the Circuit Court ruled Chapman’s case was not moot because the district court could still award damages and/or enter an injunction.  In reaching its decision, the Circuit Court acknowledged, but refused to follow and in fact overruled, its earlier decisions, including Damasco v. Clearwire Corp., which had held claims moot when a plaintiff declined an offer that would satisfy his or her entire demand. In doing so, the Circuit Court relied on the dissent by U.S. Supreme Court Justice Elena Kagan in Genesis Healthcare Corp. v. Symczyk.

The Circuit Court’s ruling, which comes as the U.S. Supreme Court considers the impact of an offer of judgment on TCPA class actions, may provide insight into how SCOTUS will ultimately decide this issue.  In fact, the Circuit Court acknowledged this point and stated it is “best to clean up the law of this circuit promptly, rather than require Chapman and others in his position to wait another year for the Supreme Court’s decision.”

While we continue to await the decision from SCOTUS, this case provides insight into how the Seventh Circuit anticipates SCOTUS will rule.  At the same time, this decision is detrimental to TCPA defendants who sought to rely on the Seventh Circuit’s prior rulings to support a claim that a case is moot after an offer of judgment for full relief to the named plaintiff.

When an employer is responding to a breach of its employees’ personal information, one of the last things it may think about is whether the value of the credit monitoring or other identity protection services it makes available to affected employees should be considered taxable to the employees and reported as such. In Announcement 2015-22, the Internal Revenue Service clarified that it will not consider the value of such services provided by the employer to employees to be gross income or wages to the employees. The IRS also stated it will not take the position that the employees should include the value of such services as gross income on their personal income tax returns.

Providing identity protection services is a common step companies take to mitigate harm following a data breach and, depending on the state laws triggered, can be required. In general, Section 61 of the Internal Revenue Code describes gross income very broadly to include compensation for services including fees, commissions, fringe benefits, and similar items, and pensions. However, the IRS will not be asserting that individuals affected by a data breach must include in gross income the value of the identity protection services provided by the organization that experienced the data breach. This position likely applies to the tax treatment of such services provided to individuals by any organization following a data breach.

The IRS announcement states, however, this position will not apply to cash received in lieu of identity protection services, or to identity protection services received for reasons other than as a result of a data breach, such as identity protection services received in connection with an employee’s compensation benefit package. In those cases, the cash received or the value of the services provided likely would be taxable income. The announcement also does not affect the tax treatment of proceeds received under an identity theft insurance policy which is governed by existing law.

Note that state income taxes potentially could apply, although many states “piggy-back” on federal tax law and may follow the IRS Announcement here. Organizations and individuals should confirm with their tax advisors the tax treatment for these services at the state level.


When businesses set out to safeguard “personal information,” a fundamental consideration is what that term means. Likewise, when negotiating a third-party vendor agreement, it typically is not enough to rely on the standard definition of “confidential information.” Recently, Nevada and other states have updated their definitions of personal information in connection with data breach notification and safeguarding requirements. We cannot cover all of the updates here, but particularly for organizations operating in multiple states, it is important to ask the question and consider exactly what elements of personal information require protection. You may end up being more protective and including more data than necessary, and it may be practical to do so, but you will want to know what must be protected.

The Usual Suspects

In states that have enacted data breach notification laws or affirmative obligations to protect personal information, you can count on personal information including the usual suspects: Social Security number (SSN), driver’s license number or state identification number, and financial account numbers and payment card numbers with access codes. Why? In general, these are the data elements believed to be the ones most likely used in the commission of identity theft. Note that a few states, like Nevada, make clear the law does not apply to the last four digits of some of these numbers, including the SSN.

But, of course, state laws are not the only source for law on the classes of personal information that warrant protection. Depending on the nature of your business, federal and international laws can also play a significant role in shaping the definition of personal information in your policy, as can contractual obligations.

Casting a Wider Net

One of the few states with an encryption mandate, Nevada recently expanded the scope of personal information subject to that mandate. Prior to the amendment, the state law (NRS 603A.040) defined personal information as noted above: Social Security number, driver’s license number or state identification number, and financial account numbers and payment card numbers with access codes. Massachusetts, which also has an encryption mandate, uses a similar definition. With the enactment of Assembly Bill No. 179, which becomes effective July 1, 2015 (though compliance is not required until July 1, 2016), “personal information” also includes:

  • driver authorization card number;
  • a medical identification number;
  • a health insurance identification number; and
  • a user name, unique identifier or electronic mail address in combination with a password, access code or security question and answer that would permit access to an online account.

A quick survey of some of the 47 state data breach notification laws reveals, in addition to the elements above, other elements of personal information that could trigger a notification requirement in certain states, such as:

  • biometric data, such as a fingerprint, retina or iris image;
  • date of birth;
  • maiden name;
  • an identification number assigned by an employer; and
  • digitized or other electronic signature.
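
For organizations inventorying their data, the kinds of element combinations listed above can be approximated in a simple screening routine. The sketch below is purely illustrative: the field names and element lists are assumptions made for the example, not any statute’s actual definition, and the triggering combinations vary by state.

```python
# Hypothetical screen for "personal information" as many breach statutes
# loosely define it. Illustrative only; element lists vary by state and
# this is not a statement of any statute's actual scope.

# Identifiers that, paired with a name, commonly trigger notification duties
SENSITIVE_ELEMENTS = {
    "ssn",
    "drivers_license",
    "state_id",
    "financial_account",
    "payment_card",
    "medical_id",
    "health_insurance_id",
    "biometric_data",
}

# Credential combinations that some states treat as personal information
# even without a name (e.g., online account access)
CREDENTIAL_PAIRS = [
    {"username", "password"},
    {"email", "password"},
]

def may_trigger_notification(fields: set[str]) -> bool:
    """Rough screen: does this set of field names suggest data that breach
    statutes may treat as personal information? (Illustrative only.)"""
    # Name plus any sensitive identifier
    if "name" in fields and fields & SENSITIVE_ELEMENTS:
        return True
    # Credential pairs permitting account access
    return any(pair <= fields for pair in CREDENTIAL_PAIRS)
```

For example, a file containing `{"name", "ssn"}` or `{"email", "password"}` would be flagged, while `{"name", "zip_code"}` alone would not. A real data inventory would, of course, be driven by counsel’s reading of each applicable statute rather than a field-name heuristic.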

As noted, classifications of personal information requiring protection are not solely a function of state law.

From a consumer protection standpoint, the Federal Trade Commission takes a broad view of personal information that needs to be secured and protected. In a decision concerning whether a company adequately safeguarded customer information, the FTC defined that term to include the following elements:

  • first and last name;
  • home or other physical address;
  • e-mail address or other online contact information, such as an instant messaging user identifier or a screen name;
  • telephone number;
  • Social Security number;
  • driver’s license or other state-issued identification number;
  • financial institution account number;
  • credit or debit card information;
  • persistent identifier, such as a customer number held in a “cookie,” a static Internet Protocol (“IP”) address, a mobile device ID, or processor serial number;
  • precise geolocation data of an individual or mobile device, including GPS-based, WiFi-based, or cell-based location information;
  • an authentication credential, such as a username and password; or,
  • any other communications or content that is input into, stored on, captured with, accessed, or transmitted through a covered device, including but not limited to contacts, e-mails, text messages, photos, videos, and audio recording.

For covered entities and business associates under HIPAA, “protected health information” encompasses health information, including demographic information, about an individual (and which does or can reasonably identify the individual) that relates to the (i) past, present, or future physical or mental health or condition of an individual, (ii) the provision of health care to an individual, or (iii) the past, present, or future payment for the provision of health care to an individual.

For employers, federal statutes like the Genetic Information Nondiscrimination Act (GINA) can be a trap for the unwary. It requires genetic information be safeguarded and not disclosed, except under certain circumstances. It may seem unusual, but one example of genetic information is information about the manifestation of disease in the spouse of an employee.

If you are charged with preparing your company to be compliant with safeguarding personal information, it is worth spending some time thinking about what personal information you need to protect. This requires knowing your business, where you do business, where your employees and customers reside, who you do business with, what your contractual obligations are, and a number of other factors. The answers may surprise you.

The saying – never let them see you sweat – soon may be more difficult to live up to with Microsoft’s HoloLens. Like Google Glass, the HoloLens is worn as a headset. But this device has a “plurality” of sensors that gather a range of biometric parameters (heart rate, perspiration, etc.), which, along with other information, determine whether the wearer needs help with something; the device then tries to provide that help. Referred to in Microsoft’s patent application, approved earlier this year, as an “augmented reality help system,” the device’s applications and implications can be far reaching; it is not hard to see, for example, why companies might want to adopt this technology to benefit their business.

Consider a manufacturing or IT employee having trouble trying to install a new piece of equipment or assemble a piece of flat-pack furniture, a chore that drives some of my own biometric parameters. HoloLens may be able to help. The patent application states:

A person may experience stress that is related to a situation or current context. For example, a person may have difficulty performing a task and grow frustrated as the number of unsuccessful attempts at completing the task grows…

Experiencing stress may also inhibit clear thinking and increase the difficulty of successfully managing a task or situation. Additionally… seeking help from electronic devices would impose inconvenient burdens on the person, or may be impractical or even impossible given the person’s current context…

To address the above issues, an augmented reality help system [would] determine that the user is experiencing a stress response [and] present help content to the user via the head-mounted display device.

So, HoloLens can be a valuable tool for an individual trying to overcome complicated tasks at work, using various sensors to simultaneously collect and analyze a wide range of biometric and other data points to determine whether the individual needs some help doing his or her job or a particular task. The device then provides information to the wearer through holographic images to help resolve the problem. These sensors include:

  • a heart rate monitor to measure heart rate,
  • a pulse oximeter sensor to measure hemoglobin saturation,
  • an electrodermal response sensor to monitor the skin’s electrical resistance,
  • an electroencephalographic (EEG) monitor to monitor brainwave activity, and
  • a perspiration sensor to detect sweat.

The descriptions of the device in the patent application, news outlets, and reports point to various applications and uses for HoloLens. A device like this might have substantial productivity benefits; one can envision lower training costs and fewer errors, among other advantages. However, like many new technologies, implementation would need to be handled carefully, not only to assess whether the device will work for the intended application, but also whether it will be worth the investment and effort given the legal and other risks. HoloLens adds to the long list of technologies and devices already on the market that legislatures and courts are struggling to understand and regulate.

Privacy and data security are among the many legal considerations and, of course, are critical, as the device collects a range of health-related data that would seem able to paint a detailed, albeit incomplete, picture of an individual’s physical and/or mental health condition. Would an employee realize how much data is being collected and to whom that information is made available? Labor relations is another consideration, as employers would certainly have to bargain with the union before they could require represented employees to use HoloLens for the purposes contemplated here. An employer also would have to consider, for example, whether gathering biometric and other medical data constitutes a disability-related inquiry under the Americans with Disabilities Act and how the U.S. Equal Employment Opportunity Commission (EEOC) might view that activity. Whether the rules the EEOC proposed earlier this year concerning workplace wellness programs will address wearables, and perhaps shed light on the agency’s view of devices such as HoloLens, remains to be seen.

Once the information is collected, how will it be used? Managers oversee and monitor their employees regularly. A plant manager might observe assembly line operations for workers causing delays, or who need additional help, or who simply are not performing sufficiently. Devices like HoloLens would dramatically increase the information available to managers in making these determinations. But will that information be the kind managers should be using? Will its use increase the likelihood of disparate impact claims? These are just a few of the questions that need to be considered. Assuming such data can be collected and used for certain work-related purposes, companies already face challenges safeguarding personal information. Will they be able to maintain the security of the sensitive health data captured and transmitted by these devices?

HoloLens has not been released for sale yet, but there already is speculation about its release date, with some saying 2016. If true, it may not be long before someone at your company says, “Hey, we need this!” At that point, and maybe even before, businesses should be thinking carefully through the benefits and risks of introducing this or similar devices into the workplace, or of allowing employees to use them.

As we have previously discussed, the Federal Communications Commission (the “FCC”) recently issued a Declaratory Ruling (“Declaratory Ruling”) that, among other things, likely exposes companies to even greater liability under the Telephone Consumer Protection Act (the “TCPA”).

The TCPA regulates communications, from companies to their consumers, that utilize an automatic telephone dialing system (“ATDS”).  Under the TCPA, before contacting a consumer via an ATDS, a company must obtain prior express consent.  (If the communication is for “telemarketing” purposes, the company must obtain this prior consent in writing.)  TCPA lawsuits have been brought not only against predictable defendants, such as telemarketing firms and debt collectors, but also against social networking companies, sports franchises, schools and universities, pharmaceutical companies, travel and entertainment companies, retailers, and online service providers.  Companies that outsource their telemarketing services to third-party vendors, it is important to note, are not immune from TCPA liability and, in fact, may be held directly liable for their vendors’ TCPA violations.  Faced with the prospect of staggering, uncapped statutory damage liability, companies have routinely settled TCPA class actions for tens of millions of dollars.

Even in single-plaintiff cases, damages under the TCPA can accumulate in a hurry.  In a recently decided case, a U.S. District Court granted partial summary judgment in favor of a TCPA plaintiff, awarding her $229,500 in damages.  Beyond the high damages figure, the case raises concern for companies that utilize an ATDS because it demonstrates the breadth of TCPA liability.  In this case, Plaintiff alleged that Defendant made 163 automated or prerecorded calls to her mobile phone without her consent.  Defendant moved to stay trial, arguing that the Court should await interpretive guidance from the FCC on the definition of “called party” under the TCPA.  This definition is significant, Defendant argued, because, although it ultimately called Plaintiff, it had intended to call the previous owner of Plaintiff’s number – a customer who had consented to receive calls regarding his past due account balance.  The Court denied Defendant’s motion, holding that “called party” unequivocally refers to the party actually called.  Defendant’s intent, the Court held, was relevant only on the issue of willfulness.

The Court also rejected Defendant’s argument that the system it used to call Plaintiff was not an ATDS because it did not generate numbers to dial at random or in sequence, but instead made a list of customers that met certain criteria – in this instance, customers who were behind on their bills – and dialed them.  The Court found, however, that whether Defendant’s system actually dialed Plaintiff’s number randomly was irrelevant.  Because the system had the capacity to dial numbers at random, it was an ATDS.  Period.

Defendant’s next argument – that it was only liable for the 70 calls it made that were connected – was likewise unavailing.  Defendant, the Court held, “violated the statute each time it placed a call using its ATDS without consent, regardless of whether the call was answered by a person, a machine, or not at all.”

Although it resulted in only a nominal victory for Defendant, the Court drew an important distinction in the area of consent.  Between July 3 and October 3, 2013, Defendant placed 10 calls to Plaintiff via its ATDS.  Plaintiff was not the intended recipient of these calls – the prior owner of Plaintiff’s number was.  Following the tenth call, Plaintiff informed Defendant that she had assumed ownership of the number previously held by the customer that Defendant was attempting to reach, and asked Defendant to stop calling her.  Defendant did not do so, but instead called Plaintiff an additional 153 times.  The Court found that the first 10 calls – those preceding Plaintiff’s request that Defendant cease calling her – were covered by the broad consent given to Defendant under its Service Agreement (“We may call you . . . for any purpose . . .”), and thus were not violative of the TCPA.   Once Plaintiff requested that Defendant stop calling her, however, she effectively revoked her consent, and all calls thereafter violated the TCPA.  The Court held that Defendant’s violation of the TCPA was knowing and willful because it had ignored Plaintiff’s request that it cease calling her.  The Court thus awarded Plaintiff treble damages.
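The arithmetic behind the award is worth making explicit. The sketch below reconciles the $229,500 figure; the call counts come from the opinion, while the $500-per-violation figure and the treble multiplier are the TCPA's statutory damages provisions (47 U.S.C. § 227(b)(3)):

```python
# TCPA statutory damages: $500 per violating call, which a court may
# treble when the violation is knowing or willful.
PER_VIOLATION = 500   # dollars per call, 47 U.S.C. § 227(b)(3)
TREBLE = 3            # multiplier available for willful/knowing violations

total_calls = 163      # all ATDS calls Defendant placed to Plaintiff
consented_calls = 10   # calls covered by the Service Agreement's broad consent
violating_calls = total_calls - consented_calls  # 153 calls after revocation

base_award = violating_calls * PER_VIOLATION  # $76,500
treble_award = base_award * TREBLE            # $229,500 -- the Court's award

print(f"Base award:   ${base_award:,}")
print(f"Treble award: ${treble_award:,}")
```

The trebled figure matches the Court's $229,500 award, confirming that damages were assessed only for the 153 post-revocation calls, at the statutory minimum, trebled for willfulness.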

Had the Court issued its decision after the Declaratory Ruling was released, it likely would have tagged Defendant with an additional nine TCPA violations.  To encourage businesses to institute new and/or better safeguards against calling reassigned numbers, the Declaratory Ruling limits companies to one call following reassignment before liability begins to accrue.  To avail itself of even this narrow safe harbor, a company must have a reasonable basis for believing that its one call was consented to.

In sum, the Declaratory Ruling has opened the door to even greater liability under the TCPA.  Additionally, as we covered back in May, the U.S. Supreme Court will soon decide the fate of a valuable strategy to limit TCPA liability – offers of judgment under Rule 68 of the Federal Rules of Civil Procedure.  If the Court rules that TCPA defendants may no longer utilize this tool, the settlement leverage of TCPA plaintiffs will be dramatically enhanced, and the plaintiff’s bar will be emboldened in its search for TCPA plaintiffs.  In light of the present breadth of liability under the TCPA, and the possibility that it may soon become even more expansive, companies should strongly consider the following preventive measures, among others:

  1. Review the policies and practices of third-party vendors to ensure that they are not sending communications violative of the TCPA;
  2. Either obtain written consent for all ATDS communications, or be sure to carefully delineate between telemarketing and non-telemarketing campaigns, obtaining written consent prior to sending any ATDS communication in connection with the former;
  3. Utilize consent forms that are conspicuous and easily understood, thereby mitigating the risk that the form will be deemed invalid;
  4. Maintain all consent records for at least four years (the statute of limitations period for TCPA claims);
  5. Assess the efficacy of current safeguards against calling reassigned numbers and, if necessary, improve or replace those safeguards; and
  6. Provide consumers user-friendly mechanisms – such as texting “STOP” or “UNSUBSCRIBE” – to opt out of receiving TCPA-covered communications.


In June, Connecticut’s governor signed into law Senate Bill 949, which amended the State’s breach notification statute. The requirement that covered businesses provide one year of identity theft protection services for certain breaches, easily the most publicized aspect of the legislation, may have diverted attention from other significant aspects of the new law. Senate Bill 949 established expansive data security requirements for entities contracting with state agencies and for entities in the health insurance and administration business (e.g., health insurers, pharmacy benefits managers, and third-party administrators). See a more complete discussion of the law here, and some highlights below.

Contractors Must Implement a Data Security Program

Beginning July 1, 2015, entities that have contracts with the state and receive “confidential information” from state agencies are required to implement and maintain a “comprehensive data-security program,” including the use of security policies, annual reviews of those policies, access restrictions, and mandatory security awareness training for employees.

Some of the requirements include:

  • Policies must restrict access to confidential information only to authorized employees.
  • There must be security and breach investigation procedures.
  • The data security program must be reviewed annually.
  • When applicable, contractors must provide the state Attorney General and the contracting agency a report detailing breaches or suspected breaches, including mitigation plans or why the contractor believes no breach occurred.
  • Contractors cannot store confidential information on stand-alone computers, notebooks, or portable storage devices, such as USB drives. This provision has limited exceptions.
  • Contractors may not copy, reproduce, or transmit confidential information except as necessary to complete the contracted services.

Because of the way many businesses perform their services today (e.g., utilizing flash drives and allowing employees to work from home, perhaps with their own computers), the new mandates may require significant changes in current practices. Contractors that are “business associates” of a state agency as defined under HIPAA may have to do more than comply with the HIPAA privacy and security regulations, and should revisit their HIPAA policies and procedures to ensure compliance with the state mandates. The contracts themselves also could impose additional security obligations.

Health Insurance Businesses Must Step Up Data Security

Beginning October 1, 2017, any health insurer, health care center, pharmacy benefits manager, third-party administrator, utilization review company, or entity that is licensed to do health insurance business in Connecticut must implement and maintain a “comprehensive information security program to safeguard the personal information of insureds.” Examples of the safeguards the program must include are:

  • secure computer and Internet user authorization protocols;
  • secure access control measures that include, but are not limited to: restriction of access to personal information to only those who require such data to perform their job duties; passwords that are not default passwords and are reset at least every six months; encryption of all personal information while being transmitted on a public Internet network or wirelessly; encryption of all personal information stored on a laptop computer or other portable device; and monitoring of company security systems for breaches of security;
  • designation of one or more employees to oversee the security program;
  • identification and assessment of reasonably foreseeable internal and external risks to the security of the personal information; and
  • annual review of the scope of the secure access control measures.

Many of these entities are either covered entities or business associates under HIPAA. They should take note, however, that some of these new requirements could go beyond basic HIPAA regulatory mandates. For example, the Connecticut law requires passwords be changed at least every six months. The Connecticut law also requires encryption of all personal information while being transmitted on a public Internet network or wirelessly and when stored on a laptop or other portable device. Beginning October 1, 2017, covered health insurance businesses must certify annually to the Insurance Department, under penalty of perjury, that they maintain a comprehensive information security program that complies with the law’s requirements.

Implications

Businesses covered by the new requirements must take stock of their current operations, policies, and procedures to determine whether they are in compliance. The law also has implications beyond the businesses to which it applies directly. Consider professional service providers working with covered state contractors or health insurance businesses. Their services might involve the need to access the same confidential information triggering these requirements. These and similarly situated businesses will need to be prepared.

Getting compliant will take time, and will come only after careful assessment and analysis. Turning this task over entirely to the company’s “IT guy” is likely not the best approach. The role of IT is no doubt critical, but these mandates require consideration of administrative and physical safeguards, as well as technical safeguards. They envision careful assignment of access to personal data based on particular need. They seek broad awareness of the safeguards throughout an organization, accomplished through training and other measures. They mandate incident response planning, a function involving key decision makers in an organization so that they know what to expect and understand their responsibilities in the event of a breach. They require organizations to obligate their third-party service providers to adhere to similar standards. In short, they contemplate a wholesale, enterprise-wide, regularly reviewed approach to securing confidential information that changes and develops with the organization.