The SEC Signals Heightened Attention to Cybersecurity and Public Disclosure Requirements

Through its actions and publications, the Securities and Exchange Commission (SEC) has shown an increased focus on cybersecurity and on the public disclosure of cybersecurity risks and incidents. In early 2018, the SEC issued a statement and an interpretive guide to assist companies in understanding and carrying out their disclosure obligations concerning cybersecurity risks and incidents. In the accompanying statement, the SEC explained that “the scope and severity of risks that cyber threats present have increased dramatically, and constant vigilance is required to protect against intrusions.”

This SEC guidance builds on guidance released by the SEC Division of Corporation Finance in 2011. The interpretive guide outlines the SEC’s view of the cybersecurity disclosures required under federal law. It also touches on the importance of public companies maintaining cybersecurity policies and procedures and discusses prohibited insider trading activities related to cybersecurity breaches.

The interpretive guide essentially puts public companies on notice regarding disclosure requirements for material cybersecurity risks and incidents. It explains that some reports required under the Securities Act and Exchange Act may prompt disclosure of cybersecurity risks facing a company as they relate to financial, legal, or reputational consequences. Importantly, the guide cautions that disclosures should be “timely” and warns that ongoing investigations, by themselves, do not provide a basis for avoiding the disclosure of a material cybersecurity incident.

Signaling an emphasis on enforcement, SEC Chairman Jay Clayton warned that “issuers and other market participants must take their periodic and current disclosure obligations regarding cybersecurity risks seriously, and failure to do so may result in an enforcement action.”

True to its word, after releasing the interpretive guide, the SEC brought multiple enforcement actions over cybersecurity disclosures. See SEC Enforcement Actions. Many of these actions have resulted in settlements with fines in the millions of dollars, coupled with agreements by companies to improve their cybersecurity policies and procedures. The SEC appears to be focused on companies that, in the agency’s view, made misleading statements or omissions pertaining to a cybersecurity breach and failed to properly assess whether the breach should have been incorporated into their public disclosures.

Moreover, in its strategic plan for 2018-2022, the SEC highlighted an expanded focus on cybersecurity and data protection to address the agency’s belief that “cybersecurity threats to the complex system that helps the markets function are constant and growing in scale and sophistication.” As one of the goals outlined, the SEC stated its intention to examine strategies to address cybersecurity risks facing capital markets.

These collective efforts likely foreshadow greater SEC involvement in cybersecurity and disclosure requirements. Going forward, companies must be sure they have a cybersecurity policy and plan in place and must quickly evaluate whether a cybersecurity incident requires public disclosure.

A Trio of OCR HIPAA Breach Resolutions: Is Your Organization HIPAA Compliant?

Over the past thirty days, the Office for Civil Rights (“OCR”) has reached three HIPAA breach resolutions, signaling to covered entities and business associates under HIPAA the importance of instituting basic best practices for data breach prevention and response.

On November 26th, the OCR announced a settlement with Allergy Associates of Hartford, P.C. (Allergy Associates), a health practice specializing in the treatment of allergies, over alleged HIPAA violations resulting from a doctor’s disclosure of patient information to a reporter. A doctor from Allergy Associates was questioned by a local television station regarding a dispute with a patient and, the investigation found, disclosed the patient’s protected health information (PHI). The OCR concluded that the disclosure demonstrated a “reckless disregard for the patient’s privacy rights.” Allergy Associates agreed to a monetary settlement of $125,000 and a corrective action plan that includes two years of monitored HIPAA compliance.

» A well-thought-out media relations plan, together with regular security awareness training (even for doctors), would go a long way toward reducing these risks.

Then, on December 4th, the OCR announced that it had reached a settlement with Advanced Care Hospitalists PL (ACH), a physician group in Florida, over alleged HIPAA violations resulting from the sharing of protected health information (PHI) with a vendor. According to OCR’s announcement, ACH engaged an unnamed individual to provide medical billing services without first entering into a business associate agreement (BAA). While the individual appeared to work for Doctor’s First Choice Billing (“First Choice”), First Choice had no record of this individual or his activities. ACH later became aware that patients’ PHI was visible on First Choice’s website, with nearly 9,000 patients’ PHI potentially exposed. In the settlement, ACH did not admit liability but agreed to adopt a robust corrective action plan, including the adoption of business associate agreements, a complete enterprise-wide risk analysis, and comprehensive policies and procedures to comply with the HIPAA rules. In addition, ACH agreed to a $500,000 payment to the OCR.

» This is not the first time the OCR has reached settlements with covered entities over the failure to have business associate agreements in place. Covered entities should consider a more formal vendor assessment and management process. That is, certainly make sure a BAA is in place, but also assess the business associate’s policies, procedures, and practices.

And finally, on December 11th, the OCR announced a settlement with Pagosa Springs Medical Center (PSMC), a critical access hospital in Colorado, for potential HIPAA privacy and security violations. The settlement resolves a complaint that a former employee of PSMC continued to have remote access to the hospital’s scheduling calendar, which included patients’ electronic protected health information (ePHI), after his employment ended. OCR’s investigation revealed that PSMC did not have a business associate agreement in place with its web-based scheduling calendar vendor or with the former employee. PSMC agreed to implement a two-year corrective action plan that includes updates to its security management and business associate agreement policies and procedures, as well as workforce training. In addition, PSMC agreed to a $111,400 payment to the OCR.

“It’s common sense that former employees should immediately lose access to protected patient information upon their separation from employment,” said OCR Director Roger Severino. “This case underscores the need for covered entities to always be aware of who has access to their ePHI and who doesn’t.”

» This is a lesson for all businesses: when employees leave the organization (or move out of a position that permits access to certain protected information), immediate changes should be made to their access, including both physical and electronic access.
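For organizations that provision access electronically, this lesson can be reduced to a short, repeatable offboarding routine. The sketch below is illustrative only: the disable_account, revoke_badge, and remove_calendar_access functions are hypothetical stand-ins for whatever identity, physical access, and scheduling systems an organization actually uses.

```python
from datetime import datetime, timezone

# Hypothetical connectors; replace with calls to your actual identity,
# badge, and scheduling systems.
def disable_account(user_id: str) -> None:
    print(f"directory: disabled account for {user_id}")

def revoke_badge(user_id: str) -> None:
    print(f"facilities: deactivated badge for {user_id}")

def remove_calendar_access(user_id: str) -> None:
    print(f"scheduling: removed calendar access for {user_id}")

def offboard(user_id: str) -> dict:
    """Revoke electronic and physical access as soon as employment ends,
    keeping a timestamped record for later audits or regulator inquiries."""
    steps = {
        "account_disabled": disable_account,
        "badge_revoked": revoke_badge,
        "calendar_access_removed": remove_calendar_access,
    }
    record = {"user": user_id, "completed": {}}
    for step_name, action in steps.items():
        action(user_id)
        record["completed"][step_name] = datetime.now(timezone.utc).isoformat()
    return record

# Example: run the routine the day the employment relationship ends.
print(offboard("former.employee"))
```

The point is less the particular tooling than the discipline: every access path, physical and electronic, is revoked in one step, and the revocation is documented.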

This series of recent settlements serves as a reminder of the seriousness with which the OCR treats HIPAA violations. In October, in honor of National Cybersecurity Awareness Month, the OCR, together with the Office of the National Coordinator for Health Information Technology, launched an updated HIPAA Security Risk Assessment (SRA) Tool to help covered entities and business associates comply with the HIPAA Security Rule. This is an excellent tool to help organizations conduct an enterprise-wide risk analysis. Alternatively, our HIPAA Ready product provides a scaled approach for midsized and smaller healthcare practices and business associates. In the end, healthcare organizations and their business associates need to address basic best practices, including terminating employee access in a timely manner, maintaining proper business associate agreements, and having a plan for media relations.

The Data Care Act of 2018

A new bill in the Senate proposes to hold large tech companies, specifically “online service providers,” responsible for the protection of personal information in the same way banks, lawyers, and hospitals are held responsible. The Data Care Act of 2018, introduced on December 12, 2018, is designed to protect users’ information online and penalize companies that do not properly safeguard such data.

Personal data under the bill includes:

  • Social Security number;
  • Driver’s license number;
  • Passport or military identification number;
  • Financial account number, or credit or debit card number together with the access code or password necessary to permit access to the financial account;
  • Unique biometric data, including a fingerprint, voice print, retina image, or other unique physical representation;
  • Account information, such as a user name and password or an email address and password; and
  • First and last name, or first initial and last name, in combination with date of birth.

The bill would also protect personal information from being sold or disclosed unless the end user agrees.

The bill is seen as part of a broader push to enact federal privacy legislation, in part to prevent more states from enacting their own privacy legislation, similar to recent moves in California and Illinois.

The bill was introduced by Senator Brian Schatz (D-HI), the Ranking Member of the Communications, Technology, Innovation, and the Internet Subcommittee. The bill was co-sponsored by 14 Senate Democrats.

Senator Schatz stated in a press release that people “have a basic expectation that the personal information they provide to websites and apps is well-protected and won’t be used against them. Just as doctors and lawyers are expected to protect and responsibly use the personal data they hold, online companies should be required to do the same.”

The bill would be enforced by the Federal Trade Commission (FTC). It would establish three basic duties: a duty of care, a duty of loyalty, and a duty of confidentiality. If the bill passes, the FTC would go through the normal notice-and-comment rulemaking process to further establish how authorities will define, implement, and enforce concepts like “reasonable” security measures.

There has been no shortage of federal initiatives seeking heightened protection for consumer personal data in the past couple of years, particularly since enactment of the EU’s GDPR, and it may only be a matter of time before one of them finally sticks. We will continue to report on the Data Care Act of 2018 and other similar initiatives as developments unfold.

ONC and OCR Update HIPAA Security Risk Assessment Tool for National Cyber Security Awareness Month

October 2018 marks the 15th annual National Cyber Security Awareness Month. In honor of this occasion, the Office of the National Coordinator for Health Information Technology (ONC) and the HHS Office for Civil Rights (OCR) have jointly launched an updated HIPAA Security Risk Assessment (SRA) Tool to help covered entities and business associates comply with the HIPAA Security Rule. But remember, the HIPAA Security Rule does not require a “one-size-fits-all” approach to security.

Under the HIPAA Security Rule, a covered entity or business associate must “[c]onduct an accurate and thorough assessment of the potential risks and vulnerabilities to the confidentiality, integrity, and availability of electronic protected health information [e-PHI] held by the covered entity or business associate.” See 45 CFR § 164.308(a)(1)(ii). Failing to conduct a risk assessment can become a basis for significant monetary exposure to the OCR, such as this $750,000 settlement by a covered health care provider with OCR.

“An enterprise-wide risk analysis is not only a requirement of the HIPAA Security Rule, it is also an important process to help healthcare organizations understand their security posture to prevent costly data breaches,” stated ONC and OCR in their joint news release on the updated SRA Tool. True. Healthcare and non-healthcare organizations are increasingly seeing a similar risk assessment requirement under a growing body of state law, such as in California, Colorado, Massachusetts, New York, and Oregon.

Recognizing that conducting this enterprise-wide risk analysis can be a challenging task, the ONC and OCR developed a downloadable SRA Tool in 2014 to help covered entities and business associates identify risks and vulnerabilities to e-PHI. According to ONC and OCR, the October 2018 update to the SRA Tool improves usability and expands its application to a broader range of health data security risks. Still, the SRA Tool may not be the right fit for all small and midsized covered entities and business associates. In fact, the HIPAA Security Rule contemplates that covered entities and business associates may use any security measures that reasonably and appropriately implement the standards and implementation specifications. In doing so, they may take into account certain factors about their organization: (i) size, complexity, and capabilities; (ii) technical infrastructure, hardware, and software security capabilities; (iii) costs of security measures; and (iv) probability and criticality of potential risks to electronic protected health information.

Use of the SRA Tool is not required by the HIPAA Security Rule, and its use alone does not mean that an organization is compliant with the HIPAA Security Rule or other federal, state or local laws and regulations. However, it may help organizations in their efforts to comply with the HIPAA Security Rule requirement to conduct periodic security risk assessments. Notably, while the SRA Tool may provide a basic outline for the risk assessment process, it does not provide substantive legal guidance as to how a covered entity or business associate is to navigate between the various standards that are either “required” or simply “addressable.” While completing a risk assessment is a requirement under HIPAA, organizations should seek guidance from legal counsel as to how to complete such an assessment and how to develop and implement appropriate safeguards based on the results of the assessment. Failing to do so could create significant liability for your organization.

Failing to conduct regular risk assessments could not only lead to a healthcare data breach, but it could also result in a covered entity or business associate being fined by the OCR. To learn more about how the firm can assist healthcare organizations with HIPAA compliance and data security, please contact your Jackson Lewis attorney.

California Consumer Privacy Act Amendment Signed Into Law

On September 23, 2018, Governor Jerry Brown signed into law SB-1121, amending certain provisions of the California Consumer Privacy Act of 2018 (CCPA), which was enacted in June of this year. As we reported previously, the CCPA will apply to any entity that does business in the State of California and satisfies one or more of the following: (i) annual gross revenue in excess of $25 million; (ii) alone or in combination, annually buys, receives for the business’ commercial purposes, sells, or shares for commercial purposes the personal information of 50,000 or more consumers, households, or devices; or (iii) derives 50 percent or more of its annual revenues from selling consumers’ personal information. Under the CCPA, key consumer rights will include:

  • A consumer’s right to request deletion of personal information which would require the business to delete information upon receipt of a verified request;
  • A consumer’s right to request that a business that sells the consumer’s personal information, or discloses it for a business purpose, disclose the categories of information it collects, the categories of information sold or disclosed, and the identity of any third parties to which the information was sold or disclosed;
  • A consumer’s right to opt out of the sale of personal information by a business, along with a prohibition on the business discriminating against the consumer for exercising this right, including a prohibition on charging the consumer who opts out a different price or providing the consumer a different quality of goods or services, except if the difference is reasonably related to the value provided by the consumer’s data.

SB-1121’s amendments include:

  • A clarification to the definition of personal information: The data elements listed in the definition are personal information, not automatically, but to the extent that they identify, relate to, describe, are capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household.
  • An expansion of exempt information to include protected health information collected by a business associate governed by HIPAA/HITECH.
  • A clarification that personal information collected, processed, sold, or disclosed pursuant to the Gramm-Leach-Bliley Act, the California Financial Information Privacy Act, or the Driver’s Privacy Protection Act of 1994 is exempt regardless of whether the CCPA conflicts with these laws.
  • An exemption for information collected as part of a clinical trial subject to the Common Rule.
  • A clarification that information collected pursuant to the Gramm-Leach-Bliley Act and the Driver’s Privacy Protection Act of 1994 will not be exempt from a consumer’s cause of action relating to certain data breaches.
  • A clarification that a private cause of action exists only for data breaches and only if prior to initiating any action for statutory damages, a consumer provides a business 30 days written notice and opportunity to cure any violation. Notice is not required in an action solely for pecuniary damages.
  • Removal of a requirement for a consumer to provide notice of a private cause of action to the Attorney General.
  • Incorporation of a provision that businesses, service providers, or persons who violate the CCPA and fail to cure such violation within 30 days of written notice shall be liable – in an action brought by the state Attorney General – for a civil penalty of not more than $2,500 for each violation or $7,500 for each intentional violation.
  • An extension of the time for the Attorney General to adopt regulations from January 1, 2020 to July 1, 2020.
  • A provision that the Attorney General shall not bring an enforcement action under CCPA until 6 months after publication of the final implementation regulations or July 1, 2020, whichever is sooner.

With an effective date of January 1, 2020 (and regulations not yet proposed), it is expected that additional amendments will be negotiated, drafted, and published as consumers and industry groups advocate for additional changes.

Following on the heels of the European General Data Protection Regulation (“GDPR”) (See Does the GDPR Apply to Your U.S. Based Company?), the CCPA is a reminder that data privacy protection initiatives are spreading across the U.S. and globe. Brazil, India, Indonesia, and the Cayman Islands recently enacted, upgraded, or drafted comprehensive data protection laws. In May, Vermont passed a law requiring data brokers to implement a written information security program, disclose to individuals what data is being collected, and permit individuals to opt-out of the collection. In April, the Chicago City Council introduced the Personal Data Collection and Protection Ordinance, requiring opt-in consent from Chicago residents to use, disclose or sell their personal information. This fall, San Francisco is scheduled to vote on its “Privacy First Policy”, an ordinance requiring that businesses disclose their data collection policies to consumers as a predicate for obtaining city and county permits or contracts. On the federal level, several legislative proposals are being considered to heighten consumer privacy protection, including the Consumer Privacy Protection Act, and the Data Security and Breach Notification Act.

Given this legislative climate, it is important for organizations to continue developing a set of best practices to ensure the privacy and security of the personal information they collect, use, or store. Key to this process is creating a data inventory to identify what personal information is collected, how it is used, where it is stored, and when it is destroyed. Once this “data mapping” is complete, attention should be directed to drafting and implementing a written information security program (WISP). WISPs detail the administrative, technical and organizational policies and procedures an organization follows to safeguard the privacy and security of its data. These initial steps will help any organization identify and streamline its data processing activities, reduce its exposure in the event of a data breach, and prepare itself for the effective date of CCPA and future data protection legislation.
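As a purely illustrative starting point for the data-mapping step described above, an inventory can be as simple as a structured record per category of personal information noting how it is collected, why it is used, where it is stored, who it is shared with, and when it is destroyed. The field names and sample entry below are assumptions for illustration, not a format prescribed by the CCPA or any other statute.

```python
from dataclasses import dataclass, asdict, field
import json

@dataclass
class DataInventoryEntry:
    category: str            # e.g., "email address and password"
    source: str              # how the information is collected
    purpose: str             # why it is used
    storage_location: str    # where it lives
    retention: str           # when it is destroyed
    shared_with: list = field(default_factory=list)  # vendors, third parties

inventory = [
    DataInventoryEntry(
        category="Email address and password",
        source="Customer account registration form",
        purpose="Authentication and service notifications",
        storage_location="Production user database (encrypted at rest)",
        retention="Deleted 30 days after account closure",
        shared_with=["Email delivery vendor"],
    ),
]

# Export the data map so legal, IT, and security teams all review the same record.
print(json.dumps([asdict(entry) for entry in inventory], indent=2))
```

However the inventory is maintained, the value comes from keeping it current and using it to drive the WISP, breach response planning, and consumer-request workflows.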

Hurricane Florence – Another Reminder to Develop a Disaster Recovery Plan

As with prior hurricanes, Florence is a reminder to all organizations of the importance of disaster recovery planning. When a storm approaches, a business’s first concern is protecting its employees and customers, and then its physical property. However, we should not forget that a natural disaster can also destroy the information and technology assets critical to a business’s success and continuity. Key steps to prepare for and respond to a natural disaster can help minimize the blow. There are many aspects to comprehensive disaster recovery planning.

Below are some recommended best practices for an effective disaster recovery plan:

  1. Build the Right Team. Companies should be clear about what they are setting out to do and involve the appropriate segments of their organizations. Disasters do not just affect IT departments; they also affect the sales force, human resources, legal, finance, and management. Leadership from these and other business segments needs to be at the table to ensure, among other things, appropriate coordination among the segments and an awareness of all available company resources. Excluding critical segments from the process will make it difficult to carry out the next critical step – assessing the risks. The IT department, whether internal or through a third-party vendor, must be well versed in disaster response.
  2. Conduct a Risk Assessment. Before an organization can develop a disaster recovery plan, it must first identify the information and technology assets it needs to protect, their locations, their role in the success of the business, their associated costs and the overall and specific risks that apply to those assets. Different disasters pose different risks and require different safeguards. It also is important to analyze how the organization’s operations would be affected upon the loss of vital components and assets, including identifying what information and technology systems are needed to safely keep the doors open.
  3. Employee/Customer Safety. Information and technology assets are critically important, but not at the expense of human life. Employees should be provided with guidelines on how to ensure their safety and that of customers, and be reminded that personal safety comes first.
  4. Develop a Plan. Having involved key personnel and assessed the risks, the organization is in a position to develop an enterprise-wide disaster recovery plan. The disaster recovery plan should be in writing and include the following:
    • Keep it short. If your plan is too long, it will be difficult to absorb, particularly in a crisis.
    • Back up regularly and keep backups off site, in a safe location. Frequent and regular backups are critical to ensuring the preservation of important organization data, as well as the data it may maintain for others. A safe location also is critical. If a data center in lower Manhattan is underwater, being able to switch to another in California, Texas or the cloud will be essential to business continuity. The same is true for voice and electronic communications systems. Having critical business data replicated and stored off-site is a good “insurance policy” for any organization.
    • Data Encryption. Encryption of sensitive and/or critical business data will prevent unauthorized users from gaining access and limit exposure.
    • Don’t neglect laptops/mobile devices. Recovery plans tend to focus on the data center; however, approximately two-thirds of corporate data exists outside the data center. Moreover, laptops and mobile devices are far less resilient than, for example, data center servers.
    • Employee Training. No one likes fire drills, but they serve a valuable purpose. Make your employees aware of the risks and steps they must take in case of a disaster.
    • Test for recovery. Perform random recovery tests periodically, audit each test, and confirm that all of your data can be recovered (a minimal automated version of this check is sketched after this list).
  5. Practice the Plan. When disaster strikes, the organization’s disaster recovery team will have to move quickly. Preparedness, therefore, is key. To be prepared, organizations should practice their plans to ensure personnel are ready to go.
  6. Update the Plan. As your organization changes, grows, and adds locations and new people, the disaster recovery plan also may need to change. A regular review of the plan is critical.
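To make the backup, encryption, and recovery-testing items above concrete, here is a minimal sketch, assuming the third-party cryptography package is available, of a routine that encrypts a backup before it leaves the building, copies it off site, and then verifies that the copy can actually be restored. The upload_offsite and fetch_offsite helpers are hypothetical placeholders for whatever cloud storage or secondary data center an organization actually uses.

```python
import hashlib
from cryptography.fernet import Fernet  # third-party "cryptography" package

offsite_store = {}  # stand-in for a real off-site destination (cloud bucket, second data center)

def upload_offsite(name: str, data: bytes) -> None:
    """Hypothetical placeholder: copy the encrypted backup off site."""
    offsite_store[name] = data

def fetch_offsite(name: str) -> bytes:
    """Hypothetical placeholder: retrieve the off-site copy during a recovery test."""
    return offsite_store[name]

def backup_and_verify(name: str, data: bytes, key: bytes) -> bool:
    """Encrypt, ship off site, then restore and compare checksums."""
    original_digest = hashlib.sha256(data).hexdigest()
    upload_offsite(name, Fernet(key).encrypt(data))      # never store plaintext off site

    restored = Fernet(key).decrypt(fetch_offsite(name))  # the "recovery test"
    return hashlib.sha256(restored).hexdigest() == original_digest

key = Fernet.generate_key()  # keep the key separate from the backups themselves
print(backup_and_verify("scheduling-calendar.db", b"critical business records", key))
```

In practice, the same verification run should be scheduled, logged, and audited, as the plan itself recommends.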

So, as you clean up from Florence or think about how your organization might be similarly vulnerable, assess whether your disaster recovery plan meets your needs. If not, make appropriate changes. If you think your business could have benefited from such a plan, there is no time like the present to develop one.

California May Lower the Standing Threshold in Data Breach Litigation

A key issue for any business facing class action litigation in response to a data breach is whether the plaintiffs, particularly consumers, will have standing to sue. Standing to sue in a data breach class action largely turns on whether the plaintiffs establish that they have suffered an “injury-in-fact” resulting from the data breach. Plaintiffs in data breach class actions often are not able to demonstrate that they have suffered financial or other actual damages resulting from a breach of their personal information. Instead, plaintiffs will allege that a heightened “risk of future harm,” such as identity theft or fraudulent charges, is enough to establish an “injury-in-fact.”

Federal circuit courts over the past few years have struggled with the question whether plaintiffs in a data breach class action can establish standing if they allege only a heightened “risk of future harm.” For example, the 3rd, 6th, 7th, 11th, and D.C. Circuits have generally found standing, while the 1st, 2nd, 4th, 5th, 8th, and 9th Circuits have generally found no standing where a plaintiff alleges only a heightened “risk of future harm.” This circuit split is due in large part to a lack of clarity following the U.S. Supreme Court’s decision in Spokeo, Inc. v. Robins, which held that even if a statute has been violated, plaintiffs must demonstrate that an “injury-in-fact” has occurred that is both concrete and particularized, but which failed to clarify whether a “risk of future harm” qualifies as such an injury.

California Senate Tackles Issue of Standing in Data Breach Class Action Suits

While businesses await the U.S. Supreme Court’s resolution of this issue, it looks like the California legislature may take matters into its own hands. Senator Bill Dodd (D) recently introduced a bill, S.B. 1121 Personal Information (an amendment to the California Customer Records Act), that would allow consumers to sue a business in response to a data breach without any showing of harm at all. The California Senate recently passed the bill by a vote of 22-13, after accepting an amendment from the Assembly to create a safe harbor for businesses that protect consumers’ personal data. The bill now moves to the California Assembly, which must vote on the bill by August 31st. If the bill passes the Assembly, Governor Jerry Brown will have 30 days to sign or veto it.

Key Aspects of S.B. 1121 Personal Information Include:

  • Each consumer could recover damages in an amount of not less than $200 and not greater than $1,000 per incident or for actual damages, whichever sum is greater.
  • Defines “breach” as “unauthorized access, use, modification, or disclosure of personal information.”
  • Consumers would have up to 4 years to sue for violation of the California Customer Records Act if their personal information was breached.
  • The current California Customer Records Act narrowly defines “customer” as an individual who provides personal information to a business for the purpose of purchasing or leasing a product or obtaining a service from the business. This bill would instead make those provisions applicable to consumers and consumer records, and would define “consumer” for purposes of those provisions broadly to include any natural person.
  • A safe harbor for businesses that have implemented and maintained reasonable security procedures and practices appropriate to the nature of the information.

Response to Senator Dodd’s Bill

S.B. 1121 Personal Information, if passed, would substantially lower (if not eliminate) the standing threshold in data breach consumer class action lawsuits. While consumer groups, including the Consumer Attorneys of California, the California Public Interest Research Group, and others, have come out in support, business organizations are strongly opposed to the bill. Opposition includes a growing coalition of over 70 groups.

In introducing S.B. 1121, Senator Dodd stressed the importance of providing consumers a means to sue following a data breach of their personal information; however, he has said he is open to amendments to the bill to prevent it from becoming “a mecca for lawsuits when no harm has been done.”

Takeaway

S.B. 1121 Personal Information is only one example of a wider trend of state and federal legislatures attempting to provide greater protection for consumers’ personal information, in response to both large-scale breaches and the E.U.’s General Data Protection Regulation. Recent amendments strengthening state data breach notification laws (e.g., Louisiana, Colorado, Arizona, South Dakota, and Alabama) and federal legislative proposals such as the Consumer Privacy Protection Act of 2017 and the Data Security and Breach Notification Act (see our blog post Senate Bill Introduced to Protect Personally Identifiable Information) are further indications of this growing trend.

You’re Gonna Need a Warrant for That….

On June 22, 2018, in Carpenter v. United States, the United States Supreme Court decided that the federal government would need a warrant in order to obtain historical location data from cellular service providers based on cell tower “pings.” (“Pings” are more formally referred to as cell-site location information, or “CSLI.”) As explained in more detail below, the issue at the center of the controversy in the Carpenter case was whether an individual’s personal location (as reflected in the CSLI) was private information protected by the Fourth Amendment, or whether any expectation of privacy was lost because the location information was shared with the cell service provider when the individual’s cell phone accessed different cell towers. The decision was by a divided court (5-4), with four separate dissenting opinions (in other words, the Court had a lot to say on this).

A bit of background on the laws that were relevant to the Court in the Carpenter case (because the Magic 8 Ball is predicting that as technology continues to be a critical aspect of our personal and business lives, there will continue to be legal activity on the issue of what is private vs. what is shared). The Fourth Amendment of the U.S. Constitution provides protections to the people of the United States to “be secure in their persons, houses, papers, and effects, against unreasonable searches and seizures,” and that “no warrants shall issue, but upon probable cause, supported by oath or affirmation, and particularly describing the place to be searched, and the persons or things to be seized.”

The Stored Communications Act (“SCA”) is one of the titles included in the Electronic Communications Privacy Act (“ECPA”). The ECPA (including the SCA) was codified in 1986. At that point in time, most people didn’t own cell phones, and if they did, they didn’t turn them on. (I only carried mine as a potential means of defense, as it was substantial enough to knock out a potential attacker (without the screen breaking).)   As the Carpenter decision notes, however, “[t]here are 396 million cell phone service accounts in the United States – for a Nation of 326 million people.” While the SCA has been amended since 1986, it is difficult for statutory and case law to keep up with the lightning speed of technology.

The SCA makes it unlawful to access or disclose stored electronic communications records unless the government compels such disclosure as allowed by the statute. Some of the ways the government may compel disclosure include an issued warrant, an authorized administrative subpoena, or a court order supported by “specific and articulable facts” showing that the information may be relevant to a criminal investigation. See 18 U.S.C. § 2703.

Now on to the facts…. The Carpenter case involved a criminal investigation by the FBI into a series of robberies in Detroit, Michigan. Federal judges issued court orders requiring two national cell phone providers to produce CSLI for incoming and outgoing calls, both for the time each call started and for the time it ended. This CSLI placed Mr. Carpenter near four of the robberies, and he was charged and convicted.

The use of CSLI in criminal investigations is where many of the cases on this type of issue arise; however, the government’s ability to obtain these records, or other uses of the records, could have broader implications. For example, this information can be used for other helpful purposes, such as locating missing children or abducted individuals, or tracking and locating terrorism suspects. It has also been used to track the location of individuals in state income tax audits in order to determine whether statutory residency tests have been met (which can affect businesses due to the potential negative impact on C-level employees who reside in a state other than where their principal office is located).

The Supreme Court found that CSLI is “intimate” data, which does more than simply show movements; it also reveals “‘familial, political, professional, religious and sexual associations.’” Moreover, this type of data is more personal than a GPS device attached to a car, as it travels with the individual and therefore accompanies the individual to the residence, the physician’s office, and other “potentially revealing locations.” And, because it is stored for years, it provides a chronicled history of an individual’s actions (unlike a public viewing of someone, which is a one-time event). The Court found this significant because courts should consider what kind of information is sought in determining whether an individual would legitimately expect the information to be private.

This ruling, however, was expressly stated to have narrow application. The Court advised that it does not apply to other types of business records that may “incidentally” include location information, and may not even protect all CSLI. The opinion noted that “[t]he Government will be able to use subpoenas to acquire records in the overwhelming majority of investigations. We hold only that a warrant is required in the rare case where the suspect has a legitimate privacy interest in records held by a third party.”

So, at this point, it seems clear that the FBI cannot access historical, chronicled CSLI records such as those obtained for Mr. Carpenter in a criminal investigation without a warrant. But as for all of the other potential uses of this type of data? That Magic 8 Ball is stuck on “Reply Hazy, Try Again.”

Fourth Circuit Weighs in on Standing in Data Breach Litigation

Cybersecurity incidents are on the rise, and so too is data breach litigation brought by plaintiffs who allege they were harmed by the unauthorized exposure of their personal information. Federal circuits across the United States are grappling with the issue of what satisfies the Article III standing requirement in data breach litigation, when often only a “risk of future harm” exists.

The United States Court of Appeals for the Fourth Circuit (“the Fourth Circuit”) is the latest circuit court to weigh in on standing in data breach litigation. In Hutton v. National Board of Examiners in Optometry, the court held that the plaintiffs satisfied the Article III standing requirement by alleging that hackers stole and misused their personally identifiable information (PII), even though no financial loss was incurred. Circuit courts have been split on the issue of standing in the data breach context, with some courts finding standing where only a heightened “risk of future harm” exists, i.e., the likelihood that stolen data may be misused (Sixth, Seventh, and Ninth Circuits), while other circuit courts require actual harm such as financial loss (Second, Third, and Eighth Circuits). The Fourth Circuit in Hutton reached a middle ground, finding that actual theft and misuse of the PII satisfied the standing threshold, even though no pecuniary damages resulted.

In Hutton, the plaintiffs, members of the National Board of Examiners in Optometry (NBEO), noticed that credit card accounts were fraudulently opened in their names, which required knowledge of their social security numbers and dates of birth. Although the NBEO never admitted to a security breach, plaintiffs concluded that the NBEO was the only common source to which they had provided their personal information. As a result, plaintiffs filed a lawsuit alleging the NBEO failed to adequately safeguard their personal information.

The NBEO filed a motion to dismiss arguing that although fraudulent credit card accounts were opened, no actual harm had occurred, and thus the plaintiffs lacked Article III standing to sue. The U.S. District Court for the District of Maryland granted the NBEO’s motion, finding, inter alia, that plaintiffs failed to sufficiently allege they had suffered an “injury-in-fact” because they had incurred no fraudulent charges and had not been denied credit or required to pay a higher credit rate as a result of the fraudulent credit card accounts.

The Fourth Circuit, however, reversed the district court, concluding that the credit card fraud and identity theft alone were sufficient to establish Article III standing. The court distinguished Hutton from its ruling in Beck v. McDonald, in which it concluded that the plaintiffs lacked standing because they alleged only a “threat of future injury”: laptops and boxes containing personal information were stolen, but that information was not misused. In Hutton, the court emphasized, plaintiffs were, unlike in Beck, “concretely injured” because credit card accounts were opened without their knowledge or approval, which qualifies as misuse even if fraudulent charges had yet to occur.

The circuit court split on the issue of Article III standing has made it difficult for businesses to assess the likelihood of litigation and its associated costs in the wake of a data breach. Until the Supreme Court weighs in on this issue, it is crucial for businesses to assess their breach readiness and develop an incident or breach response plan that takes into consideration the possibility of litigation.

California Governor Signs Into Law Groundbreaking Consumer Protection Law

As we reported earlier this week, Democrats in the California legislature reached a tentative agreement with a group of consumer privacy activists spearheading a ballot initiative for heightened consumer privacy protections, under which the activists would withdraw the existing ballot initiative in exchange for the California legislature passing, and Governor Jerry Brown signing into law, a similar piece of legislation, with some concessions, by June 28th, the final deadline to withdraw ballot initiatives.

And as agreed, yesterday, hours before the deadline to withdraw ballot initiatives, the California legislature passed, and Governor Jerry Brown signed into law, the groundbreaking California Consumer Privacy Act of 2018 (AB 375).

Consumer privacy activists were pleased with the swift passage of the new law. “This is a milestone moment for privacy law in the United States,” Marc Rotenberg, Executive Director of the Electronic Privacy Information Center, said in a statement. “The California Privacy Act sends a powerful message that people care about privacy and that lawmakers will act.”

Opponents, including the Internet Association, a lobbying group representing major technology companies, were less than pleased. “It is critical going forward that policymakers work to correct the inevitable, negative policy and compliance ramifications this last-minute deal will create for California’s consumers and businesses alike,” the group said in a statement.

A comprehensive update detailing key aspects of the new California law is available here.

 
