“Cloud computing” takes many forms, but, fundamentally, it is a networked computing model that allows consumers, businesses, and other entities to store data off-site and manage it with third-party-owned software accessed through the Internet. Files and software are stored centrally on a network to which end users can connect to access their files, using computers that are less powerful and sophisticated than those we use today. This technology reduces the need for multiple expensive servers and PCs with enough capacity to store massive data and application files. Some believe the PC of the future will need only enough capacity to run a web browser through which the user accesses his or her applications and files.


If you are not already computing in a cloud, you likely will be hearing more about “cloud computing” soon. Last month, for example, the City Council for the City of Los Angeles voted to move city employee e-mail and other applications from city computer networks to a cloud service provider – in this case, Google Inc. City officials cite significant cost savings (which they estimate to be in the millions) as one of the reasons for the switch. They acknowledged that concerns over data privacy, security and management remain.

We’ll agree that significant cost savings can be achieved through, among other things, reduced infrastructure. Questions and concerns many have with cloud computing, however, relate to the privacy, security and management of the information in the cloud. These include:

  • What if the cloud starts to rain (that is, a cloud computing data breach occurs)? Who is responsible for notifying affected persons, and who bears the costs?
  • Which company owns the data placed in the cloud?
  • If the data in the cloud is employee e-mail, is the employer still permitted to access and monitor e-mail communications? Will new policies/notices be needed?
  • Will company proprietary information be safe?
  • Who has access to the data? Who should have access?
  • Is the cloud service provider a business associate under HIPAA, prepared to comply with the HITECH Act? What other legal compliance requirements are there?
  • Do we still need to maintain a back-up of data in the cloud?
  • Where is the data stored? Is it in the United States, or in a foreign country subject to different data security standards? Does one location as opposed to another provide better access or security? If data is stored in multiple places, will we be able to locate what we need when we need it?
  • How big is the cloud? How much can we store?
  • What if the cloud goes down? How do we get our data and access the applications needed to run our business?
  • How do we move between clouds? Can our data be held captive when contract negotiations fall through?
  • Can we put our clients’ data in the cloud? Do we have to tell them where it is?
  • What happens to the data if the cloud service provider or the cloud customer goes out of business?
  • Will applications in the cloud work the same way, be as flexible, and respond with the same speed as those on current PCs?

Organizations such as the Cloud Security Alliance have been formed to grapple with some of these issues. Indeed, the City of Los Angeles has had to respond to some of these concerns. So, while cloud computing may yield substantial cost savings and appear tempting, these and other questions and concerns should be addressed before moving in that direction.

The annual Cost of a Data Breach Report (Report) published by IBM is reliably full of helpful cybersecurity data. This year is no different. After reviewing the Report, we pulled out some interesting data points. Of course, the Report as a whole is well worth the read, but if you don’t have the time to get through its 78 pages, this post may be helpful.

What is new in the Report. There are several new items covered by the Report. The two that caught our eye:

  • Is it beneficial to involve law enforcement in a ransomware attack? According to the Report, organizations that did not involve law enforcement in a ransomware attack experienced significantly higher costs, as much as $470,000 more on average. Nearly 40% of respondents did not involve law enforcement. In our experience, involving law enforcement can have significant benefits, including greater insight into the behavior of certain threat actors. Such insight can speed up efforts to contain the attack, reducing costs in the process.
  • What are the effects of ransomware playbooks and workflows? In short, it turns out the effects are good. Having playbooks and workflows for ransomware attacks helps to reduce response time and minimize costs. In fact, the benefits of incident response planning are not limited to ransomware. Organizations we encounter that have a robust incident response program are significantly more prepared to identify and respond to an incident. An incident response plan generally means having a dedicated team, maintaining a written plan, and practicing that plan. Incident response plans can be particularly important for healthcare entities, which have experienced a 53% increase in data breach costs since 2020.

AI has many benefits, including controlling data breach costs. There are two significant drivers of data breach costs – time to detect and time to contain. Shortening one or both of these can yield substantial cost savings when dealing with a data breach. According to the Report, extensive use of security AI and automation shortened breach detection and containment by 108 days on average and reduced costs by nearly $2 million. Even limited use of AI shortened the response time by 88 days, on average.

AI-driven data security and identity solutions can help drive a proactive security posture by identifying high-risk transactions, protecting them with minimal user friction and stitching together suspicious behaviors more effectively.

Healthcare continues to be the leader in data breach costs. Second place, the financial services industry, is not even close, according to the Report. Healthcare (hospitals and clinics), with an average data breach cost of $10.9 million, nearly doubles the $5.9 million average for financial services organizations. Susan Rhodes, the acting deputy for strategic planning and regional manager for the Office for Civil Rights at HHS, recently observed that ransomware attacks are up 278% in the last 5 years.

Smaller organizations faced significant data breach cost increases, while larger organizations experienced declines. We have written a bunch here on the data security and breach risks of small organizations. For the three categories of smaller organizations measured by the Report – fewer than 500 employees, 500-1,000 employees, and 1,001-5,000 employees – all experienced double-digit percentage increases, with the two larger categories seeing greater than 20% increases in costs. It is difficult to pinpoint the reasons for this disparity. However, it may be that small organizations are less likely to engage in the kinds of activities that tend to minimize data breach costs, such as incident response planning and using security AI. We also find that smaller organizations tend to view themselves as not a target of cyber criminals.

Perhaps one of the more instructive parts of the Report is Figure 16 on page 28 which illustrates the impact certain factors can have on the average cost of a breach. The top four factors that appear to drive down data breach costs include integrated security testing in software development (DevSecOps), employee training, incident response planning and testing, and AI. Factors that tend to increase costs on average include remote workforce, third party involvement, noncompliance with regulations, and security system complexity.  

When one thinks of data breach-related costs, the cost of notification may be the first to come to mind. But notification is actually the lowest of the four cost categories measured by the Report, although it has more than doubled since 2018. Beginning in 2022, detection and escalation costs took the top spot, overtaking even business interruption. These costs include “forensic and investigative activities, assessment and audit services, crisis management, and communications to executives and boards.”

Overall, the Report is filled with additional insights concerning the costs of a data breach. Here are some quick takeaways that could help your organization minimize these costs:

  • Develop, implement, and practice an incident response plan,
  • Train employees,
  • Implement AI, even a little,
  • Comply with applicable regulations, and
  • Strengthen vendor security assessment and management programs, cloud service providers in particular.

On June 6, 2023, Governor DeSantis signed Senate Bill (SB) 2262, legislation intended to create a “Digital Bill of Rights” for Floridians. While Florida’s new law provides consumers privacy rights similar to those in other states’ comprehensive privacy laws passed in recent months, the law is narrower in the scope of businesses it regulates.

Generally, the requirements of the law take effect on July 1, 2024, with certain sections taking effect sooner.

Covered Businesses

The new legislation applies to businesses that collect consumers’ personal information, make in excess of $1 billion in gross revenues, and meet one of the following thresholds:

  • Derive 50% or more of their global annual revenues from providing targeted advertising or the sale of ads online; or
  • Operate a consumer smart speaker and voice command component service with an integrated virtual assistant connected to cloud computing service that uses hands-free verbal activation.

Consumer Rights

Like many of the comprehensive privacy laws passed in recent months, the new law provides Florida consumers the right to:

  • Access their personal information;
  • Delete or correct personal information; and,
  • Opt out of the sale or sharing of their personal information.

In addition to these rights, the law adds biometric data and geolocation information to the definition of personal data, for purposes of protecting consumers.

Covered Business Obligations

Under the new law, covered businesses and their processors are required to implement a retention schedule for the deletion of personal data. Controllers or processors may retain personal data only until:

  • The initial purpose of the collection has been satisfied;
  • The contract for which the data was collected or obtained has expired or terminated; or
  • Two years after the consumer’s last interaction with the covered business.
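One way to read the retention rule above is that deletion is required at the earliest applicable trigger. The sketch below illustrates that reading under stated assumptions; the function name and the earliest-trigger interpretation are ours, not the statute's, and the two-year period is approximated as 730 days.

```python
from datetime import date, timedelta

def deletion_date(purpose_satisfied, contract_ended, last_interaction):
    """Return the earliest date a retention trigger fires (illustrative only).

    Each argument is a datetime.date or None if that trigger does not apply:
    - purpose_satisfied: date the initial purpose of collection was satisfied
    - contract_ended: date the relevant contract expired or terminated
    - last_interaction: date of the consumer's last interaction (deletion
      would be required two years later under the third trigger)
    """
    candidates = []
    if purpose_satisfied is not None:
        candidates.append(purpose_satisfied)
    if contract_ended is not None:
        candidates.append(contract_ended)
    if last_interaction is not None:
        candidates.append(last_interaction + timedelta(days=730))
    return min(candidates) if candidates else None
```

For example, if the purpose of collection was satisfied on January 1, 2025, but the consumer's last interaction was January 1, 2024, the purpose trigger fires first and would set the deletion date.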

Covered businesses will be required to provide reasonably accessible and clear privacy notices, and such notices will need to be updated annually, including disclosures to consumers regarding data collection, processing, and use practices.  

The law also requires covered businesses to develop and implement reasonable data security practices.

If you have questions about Florida’s new Digital Bill of Rights or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

In July 2020, the Court of Justice of the European Union (CJEU) declared the EU-U.S. Privacy Shield invalid. The EU-U.S. Privacy Shield program was designed to provide European Economic Area (EEA) data transferred to the U.S. with a level of protection comparable to EU law. The CJEU invalidated the program, stating that U.S. companies could not provide an essentially equivalent level of protection given the breadth of U.S. national security surveillance laws (FISA Section 702, Executive Order 12333, and PPD-28). In the wake of the decision, businesses relying on the EU-U.S. Privacy Shield as an adequate transfer mechanism to perform routine activities – such as sending employee data from the EEA to U.S. headquarters for HR administration, accessing a global HR database from the U.S., remotely accessing EEA user accounts from the U.S. for IT services, providing EEA data to third-party vendors for processing in the U.S., or relying on certain cloud-based services – were forced to rely on alternate mechanisms, including standard contractual clauses.

On October 7, 2022, President Biden signed an Executive Order that outlines steps the U.S. government will take to implement a new EU-U.S. data privacy framework, the Trans-Atlantic Data Privacy Framework, to replace the invalidated EU-U.S. Privacy Shield.

The new Framework is designed to restore a legal basis for transatlantic data flows and addresses concerns raised in the CJEU decision by strengthening privacy and civil liberties protections for foreign individuals and creating an independent and binding process for non-U.S. citizens to seek redress if they believe their personal data was improperly collected through U.S. signals intelligence. Signals intelligence activities involve collecting foreign intelligence from communications and information systems.

The Executive Order is the first step toward rebuilding the EU-U.S. data protection program. Over the next few months, the EU Commission will review the framework and, if satisfied with the proposed safeguards and protections for EU data and individuals, issue an “adequacy decision” that concludes data transferred to the U.S. will receive an essentially equivalent level of protection. While legal challenges to this new framework are anticipated, the Executive Order demonstrates a U.S. commitment to addressing EU concerns regarding data protection. It also provides an incentive for U.S. organizations to maintain their EU-U.S. Privacy Shield certification in hopes it can be leveraged under the new framework.

If you have questions about the effect of the Executive Order on your business or related issues contact the Jackson Lewis attorney with whom you regularly work or a member of our Privacy, Data, and Cybersecurity practice group.

On February 23, 2022, the EU Commission published a Proposal for a Regulation on harmonized rules on the access to and use of data as part of its strategy for making the EU a leader in the data-driven society. The “Data Act” addresses the access, use and porting of “industrial data” generated in the EU by connected objects and related services.  The Act further ensures this data will be shared, stored and processed in accordance with EU rules, including when the dataset contains personal data.

Scope

The proposed Regulation applies specifically to data from the usage of connected objects and related services (e.g., software). Data means any digital representation of acts, facts or information including in an audio, visual or audio-visual format. While the Regulation applies to data derived from usage and events, it does not apply to information derived or inferred from this data.

Connected devices (i.e., IoT) include vehicles, home equipment, consumer goods, medical and health devices, and agricultural or industrial machinery that generate performance, usage, or environmental data. Products designed primarily to display, play, record, or transmit content – such as personal computers, servers, tablets, smart phones, cameras, webcams, sound recording systems, and text scanners – are not covered by the Act.

The Regulation applies to (a) manufacturers of products and suppliers of related services placed on the market in the Union; (b) users of such products or services; (c) data holders that make data available to data recipients in the Union; (d) data recipients in the Union to whom data are made available; (e) public sector bodies and Union institutions, agencies or bodies that request data holders to make data available where there is an exceptional need for the performance of a task carried out in the public interest, and the data holders that provide those data in response to such request; and (f) providers of data processing services offering such services to customers in the Union.

Relevant Provisions

  • Manufacturers and designers must provide consumers and businesses with access to and use of data derived from utilization of connected devices they own, rent or lease as well as related services. This is data that is traditionally captured and held by the manufacturer or designer and the device owner’s right to the data is often unclear. Under the Act, the device owner will be able to use the data for after-market purposes. For example, a car owner might share usage data with their insurance company, or a business owner might use data from a connected manufacturing device to perform its own maintenance in lieu of using the manufacturer’s services. In support of these measures, manufacturers and designers must disclose what data is accessible and design products and services so the data is easily accessible by default.
  • Data sharing agreements between parties must avoid contractual terms that place SMEs at a disadvantage. The Act includes a test to assess the fairness of the contractual terms. The EU Commission plans to develop and publish non-binding model contract terms to help achieve this goal.
  • Cloud service providers must adopt portability measures that permit consumers and businesses to move data and applications to another provider without incurring any costs. The Act also mandates implementation of safeguards to protect data held in cloud infrastructures in the EU.
  • Customers shall have the right to transfer data from one data processor to another, free of commercial, technical, contractual or organizational obstacles.
  • Businesses shall provide certain data to public sector bodies in exceptional situations (e.g., public emergencies), under key conditions.
  • Cloud service providers will be subject to certain restrictions on international data sharing or access.
  • The content of certain databases resulting from data generated or obtained by connected devices will be protected.

Next Steps

The proposed Regulation is designed to stimulate competition and create opportunities for data-driven innovation as part of the EU’s data strategy. In doing so, it complements the Data Governance Act, which facilitates data sharing across sectors and Members states. As the EU continues to strengthen its data strategy, U.S. businesses will want to monitor this space and consider preliminary steps towards potential compliance. The Regulation will apply to U.S. manufacturers and service providers who place connected objects and related services in the EU market. Compliance will necessitate appropriate policies, procedures, and mechanisms to meet the Regulation’s transparency, access, data minimization and safeguards mandates. At a minimum, this will involve designing and manufacturing products and services that incorporate user access mechanisms and protections by design and default.

The leaders of our Wage & Hour Practice, Justin Barnes, Jeffrey Brecher, and Eric Magnus, collaborated with us on this article.

According to reports, Kronos, the cloud-based, HR management service provider, suffered a data incident involving ransomware affecting its information systems. Kronos communicated that it discovered the incident late on Saturday, December 11, 2021, when it “became aware of unusual activity impacting UKG solutions using Kronos Private Cloud.” Shortly after, Kronos issued a helpful Q&A for customers impacted by the incident. The company confirmed:

[T]his is a ransomware incident affecting the Kronos Private Cloud—the portion of our business where UKG Workforce Central, UKG TeleStaff, Healthcare Extensions, and Banking Scheduling Solutions are deployed. At this time, we are not aware of an impact to UKG Pro, UKG Ready, UKG Dimensions, or any other UKG products or solutions, which are housed in separate environments and not in the Kronos Private Cloud.

This incident has already impacted time management, payroll processing, and other HR-related activities of organizations using the affected services. Ransomware and similar attacks also could compromise confidential and personal information maintained on affected systems, although there is no indication of that at this point. Clearly, organizations that use these services can be affected in several ways. The FAQs below provide information on some of the key issues these organizations should be thinking about.

Isn’t this really Kronos’ problem?

This certainly is a significant issue for Kronos and, based on communications from Kronos, the company is in the process of remediating the incident and alerting its impacted customers. However, because of the nature and extent of the services Kronos provides to its customers (i.e., employers), there are several things that HR, IT, and other groups inside organizations that are customers of the affected services need to be doing. We address some of those items below.

From a communications perspective, this incident likely will receive significant news coverage, prompting questions from employees about the impact of the incident on their personal information, their schedules, their pay, etc. Employers will need to think carefully about how to respond to these inquiries, especially when there is little known at this point about the incident.

From a compliance perspective, employers should be reviewing and implementing their contingency plans depending on the scope of services received from Kronos. For example, clients using Kronos time management systems should be evaluating what measures they should be implementing to ensure their employees’ time is properly captured and paid. A company has a legal obligation to accurately track hours worked, regardless of whether its third-party vendor (like Kronos) responsible for the task can do so or not. Clients might want to institute, in the short-term, paper timekeeping and tracking systems to ensure that employees are taking appropriate breaks and being paid for all time worked. It would be especially helpful in this situation to have employees sign off that the amount of time they report and the breaks they took are accurate.

From a cybersecurity standpoint, the answer to the question of whether this is only Kronos’ problem likely is no. All 50 states, as well as certain cities and other jurisdictions, have breach notification laws. If there is a breach of security under those laws, there may be a notification obligation. The notification obligation to affected individuals largely rests with the owner of that information, which likely would be employers. We anticipate that if notification is required, Kronos may take the lead on that, although employers will want some assurances that notification will be provided in a time and manner consistent with applicable law.

What should we be doing?

There are several steps employers likely will need to take in response to this incident, not all of which are clear at this point because of what little is currently known. Still, there are some action items affected employers should be considering:

  • Stay informed. Closely follow the developments reported by Kronos, including coordinating with your HR and IT teams.
  • Consult with counsel. Experienced cybersecurity and employment counsel can help employers properly identify their obligations and coordinate with Kronos, as needed.
  • Communicate with employees. Maintaining accurate and consistent communications with employees is critical, especially considering that a significant part of the discussions around this incident could be taking place on social media. Your employees and their representatives, where applicable, may already be aware of this incident. To be prepared to address and respond to employee concerns, organizations should consider providing an initial short summary of the incident to potentially impacted individuals as soon as possible. That communication could be expanded over time with more information as it becomes available, perhaps in the form of FAQs like these. Less is more on the initial communication, again, given what little is known. However, it is important to let employees know the organization is aware of the incident and actively taking steps to mitigate its effects on employees.
  • Review Your Kronos Services and Service Agreement. Begin evaluating the services that the organization receives from Kronos. This will help to implement contingency plans, but also to assess the nature and extent of the information that Kronos maintains on the organization’s behalf. The organization might be able to conclude early on that, while there may be impacted systems and operations, Kronos was not in possession of the kind of personal information pertaining to employees of the organization that could lead to a breach notification obligation. This information could be reassuring for employees. Also, review the services agreement between the organization and Kronos as it may include provisions that have particular relevance here. For example, the agreement may outline a process agreed to between the parties for handling data incidents like this.
  • Review your cyber insurance policy. It might be premature to make a claim against the organization’s cyber policy, assuming the organization has a cyber policy – an important consideration nowadays. But, key stakeholders should review the situation and discuss potential coverage options with the organization’s insurance broker and/or legal counsel. Becoming more familiar with existing cyber insurance policies and coverage is prudent as it might cover some of the costs an organization incurs in connection with incidents like this.
  • Evaluate vendors. Some have asked whether the “Log4j” vulnerability led to the Kronos incident; however, that has not been confirmed at this time. Log4j is a Java library for logging error messages in applications. Because other vendors also may have Log4j exposure, organizations may want to use this incident as a reason to examine more closely the data privacy and security practices of other third-party vendors, regardless of whether the Log4j vulnerability was exploited here. This is particularly the case for those vendors that handle the personal information of employees and customers.
  • Revisit your own data security compliance measures. Organizations also should check their own systems for Log4j and other vulnerabilities and fix them as quickly as possible.
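As a practical illustration of the last point, one rough first-pass check an IT team might run is scanning the filesystem for JAR files that bundle log4j-core classes. This sketch is our own (the function name is hypothetical), and finding a hit does not by itself confirm a vulnerable version; it only flags files for closer review.

```python
import os
import zipfile

def find_log4j_jars(root):
    """Walk a directory tree and flag JARs containing log4j-core classes.

    A coarse inventory aid only: it does not determine whether the
    bundled log4j version is actually vulnerable.
    """
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".jar"):
                continue
            path = os.path.join(dirpath, name)
            try:
                with zipfile.ZipFile(path) as jar:
                    # log4j-core classes live under this package path
                    if any("org/apache/logging/log4j/core/" in entry
                           for entry in jar.namelist()):
                        hits.append(path)
            except zipfile.BadZipFile:
                pass  # skip corrupt or non-zip files
    return hits
```

Dedicated scanners and software inventory tools go further (checking embedded version metadata and nested JARs), but even a simple inventory like this helps scope follow-up patching.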

Will the state breach notification laws apply?

We do not know if there has been a “breach” at this point. This will require investigation and analysis of the incident, which we understand is underway at Kronos at this time. However, if the incident affects certain unencrypted personal information of individuals, such as names coupled with social security numbers, drivers’ license numbers, financial account numbers, medical information, biometric information or certain other data elements, state breach notification laws may apply. Organizations that utilize Kronos’ services globally must consider a broader definition of personal data, such as under the General Data Protection Regulation (GDPR).
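The common statutory pattern described above (a name coupled with an unencrypted sensitive data element) can be sketched as a simple triage check. This is a deliberately simplified illustration of our own; the element list and function name are hypothetical, actual definitions vary by state, and legal analysis is still required.

```python
# Illustrative only: statutory definitions of "personal information"
# and encryption safe harbors vary by jurisdiction.
SENSITIVE_ELEMENTS = {
    "ssn",
    "drivers_license",
    "financial_account",
    "medical_info",
    "biometric_info",
}

def may_trigger_notification(record, encrypted=False):
    """Return True if a record pairs a name with an unencrypted
    sensitive element, the common state-law pattern (illustrative)."""
    if encrypted:
        # Many state laws exempt properly encrypted data
        return False
    has_name = "name" in record
    return has_name and any(e in record for e in SENSITIVE_ELEMENTS)
```

A record containing only a name and an e-mail address, for example, would not match this pattern, while a name paired with a Social Security number would.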

Thousands of organizations have suffered similar attacks, all of which illustrate the importance of planning for a response, not only trying to prevent one. Third party service providers play important roles for most organizations, particularly with regard to their HR systems and corresponding operations. It will take some time to work through this incident, but it should be a reminder for all affected organizations to continue to develop, refine, and practice their contingency plans.


In April, we posted about the U.S. Department of Labor’s (DOL) Employee Benefits Security Administration (EBSA) issuing cybersecurity guidance for employee retirement plans on April 14, 2021. Shortly thereafter, the DOL updated its audit inquiries to include probing questions for plan fiduciaries about their compliance with the “hot off the press” agency guidelines.

So, what do those inquiries look like?

In short, the DOL is asking plan sponsors to produce:

all documents relating to any cybersecurity or information security programs that apply to the data of the Plan, whether those programs are applied by the sponsor of the Plan or by any service provider of the Plan

For plan fiduciaries that are new to cybersecurity and have not received a DOL audit in the last few months, it may not be clear what documents or materials the DOL is expecting. The DOL fleshes out its general inquiry with a laundry list of items. Here are some examples of those more specific requests:

  • All policies, procedures, or guidelines relating to such things as:
    • The implementation of access controls and identity management, including any use of multi-factor authentication.
    • The processes for business continuity, disaster recovery, and incident response.
    • Management of vendors and third party service providers, including notification protocols for cybersecurity events and the use of data for any purpose other than the direct performance of their duties.
    • Cybersecurity awareness training.
    • Encryption to protect all sensitive information transmitted, stored, or in transit.

The list above is not complete, but it makes clear the DOL is looking for information about what plan fiduciaries are doing to safeguard their own information and systems to address privacy and security, not just that of their service providers. Some plan fiduciaries might be wondering what policies, procedures, or guidelines to protect plan data should look like. There are many frameworks to consider when adopting reasonable safeguards. Examples include guidance published by the National Institute of Standards and Technology, the New York SHIELD Act, the Massachusetts data security regulations, and the privacy and security standards under HIPAA.

In addition to policies, procedures, and guidelines summarized above, the DOL also seeks in its audit request copies of other materials, some of which are listed below.

  • “All documents and communications relating to any past cybersecurity incidents.”

So, evidently, the DOL would like to discover whether the plan had a prior cybersecurity incident. It is unclear whether this request refers only to “breaches of security” or similar terms as defined under state breach notification laws which require notification, or mere “incidents” that do not rise to the level of a reportable breach.

  • “All documents and communications describing security reviews and independent security assessments of the assets or data of the Plan stored in a cloud or managed by service providers.”

Here the DOL makes a distinction between plan “assets” and plan “data,” seeking security reviews and assessments relating to both. Recent litigation called into question whether plan data could be considered a “plan asset.” In one of the most recent cases, Harmon v. Shell Oil Co., 2021 WL 1232694 (S.D. Tex. Mar. 30, 2021), the U.S. District Court for the Southern District of Texas rejected the argument that plan assets include plan data.

  • “All documents describing security technical controls, including firewalls, antivirus software, and data backup.”

An important note here is that it may not be enough to say, “we are doing this,” or “we have implemented antivirus and firewalls to protect our information systems.” The DOL is looking for documents that describe those safeguards and controls.

  • “All documents and communications from service providers relating to their cybersecurity capabilities and procedures.”
  • “All documents and communications from service providers regarding policies and procedures for collecting, storing, archiving, deleting, anonymizing, warehousing, and sharing data.”
  • “All documents and communications describing the permitted uses of data by the sponsor of the Plan or by any service providers of the Plan, including, but not limited to, all uses of data for the direct or indirect purpose of cross-selling or marketing products and services.”

The DOL would like to see how plan fiduciaries are communicating with their service providers to assess service provider cybersecurity risk, as well as the documents and other materials from service providers concerning the processing of plan data. Importantly, the DOL is not just looking for cybersecurity related information. The agency apparently wants to know how service providers are permitted to use plan data. Plan fiduciaries will want to think carefully about their current practices, including their communications, when selecting and working with service providers.

No plan fiduciary wants to experience a DOL audit of their retirement plans, or any other audit for that matter. But cybersecurity clearly is a new and important area of interest for the DOL and plan fiduciaries need to be prepared to respond. Feel free to contact us if you would like to discuss audit readiness concerning cybersecurity for your plans.

On May 12, 2021, the Biden Administration issued an Executive Order on “Improving the Nation’s Cybersecurity” (EO). The EO was in the works before the Colonial Pipeline cyberattack, reportedly a ransomware incident that snarled the flow of gas on the east coast for days. Ransomware attacks are nothing new, but they are increasing in severity. The public rarely sees the large sums victim organizations pay to hackers to regain access to their encrypted data or to stop a disclosure of sensitive information. But the public does see the crippling of vital infrastructure when compromised computer systems cause basic services to cease flowing.

Of course, the Colonial Pipeline incident is not the only attack we have seen affecting entities that support critical infrastructure. In February of this year, ABC News reported that weak cybersecurity controls “allowed hackers to access a Florida wastewater treatment plant’s computer system and momentarily tamper with the water supply,” based on a memo by federal investigators obtained by ABC. A month later, sensitive data belonging to New England’s largest energy provider was reportedly exposed in cloud storage for some time. Last year’s SolarWinds breach, dubbed Sunburst, was a massive compromise of government agencies, including the Department of Energy.

Will the EO help? It is unclear at this point. The EO does, however, make a clear statement of the Administration’s policy:

It is the policy of my Administration that the prevention, detection, assessment, and remediation of cyber incidents is a top priority and essential to national and economic security.  The Federal Government must lead by example.  All Federal Information Systems should meet or exceed the standards and requirements for cybersecurity set forth in and issued pursuant to this order.

The EO will mostly affect the federal government and its agencies. However, several of its requirements will reach certain federal contractors and will also influence the private sector. Below are several of the items directed by the EO:

  • Removing contractual barriers in contracts between the federal government and its information technology (IT) and operational technology (OT) service providers. The goal here is to increase information sharing about threats, incidents, and risks in order to accelerate incident deterrence, prevention, and response efforts and to enable more effective defense of government systems and information. As part of this effort, the EO requires a review of the Federal Acquisition Regulation (FAR) concerning contracts with such providers and recommendations for language designed to achieve these goals. Recommendations will include, for example, time periods contractors must report cyber incidents based on severity, with reporting on the most severe cyber incidents not to exceed 3 days after initial detection. The changes also will seek to standardize common cybersecurity contractual requirements across agencies.
  • Modernizing the approach to cybersecurity. To achieve this goal, the EO calls for steps including adopting security best practices, advancing toward Zero Trust Architecture, moving to secure cloud services, including Software as a Service (SaaS), and centralizing and streamlining access to cybersecurity data to drive analytics for identifying and managing cybersecurity risks. More specifically, the EO requires agencies, within 180 days of the date of the EO, to adopt multi-factor authentication and encryption for data at rest and in transit, to the maximum extent consistent with federal records laws and other applicable laws.
  • Improving software supply chain security. Driven by the impact of the SolarWinds incident, the EO points to, among other things, the lack of transparency in software development and the question of whether adequate controls exist to prevent tampering by malicious actors. The EO calls for guidance to strengthen this supply chain, including standards, procedures, and criteria such as securing development environments and attesting to conformity with secure software development practices. The EO also requires recommendations for contract language that would require suppliers of software available for purchase by agencies to comply with, and attest to complying with, the developed guidance. Efforts also will be made to reach the private sector. For instance, the Secretary of Commerce, acting through the Director of NIST, will initiate pilot programs to educate the public on the security capabilities of Internet-of-Things (IoT) devices and software development practices, and will consider ways to incentivize manufacturers and developers to participate in these programs.
  • Establishing a Cyber Safety Review Board. The Board’s duties would include reviewing and assessing certain significant cyber incidents affecting Federal Civilian Executive Branch (FCEB) Information Systems or non-Federal systems, as well as threat activity, vulnerabilities, mitigation activities, and agency responses.
  • Standardizing incident response. The EO standardizes the federal government’s response to cybersecurity vulnerabilities and incidents to ensure a more coordinated and centralized cataloging of incidents and tracking of agencies’ progress toward successful responses.
  • Improving detection. The EO seeks to improve detection of cybersecurity vulnerabilities and incidents on federal government networks.
  • Improving the federal government’s investigative and remediation capabilities. The Administration recognizes that agencies and their IT service providers must collect and maintain network and system logs on federal information systems to be able to address cyber incidents. The EO seeks recommendations on the types of logs to be maintained, the periods for retaining logs and other relevant data, the time frames for agencies to enable recommended logging and security requirements, and how to protect logs. These recommendations will also be considered by the FAR Council when promulgating rules for removing barriers to sharing threat information.
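
The EO does not prescribe specific logging tools or retention periods; those details are left to the forthcoming recommendations. Purely as an illustration of the kind of control the last item contemplates, the sketch below uses Python’s standard logging library to keep security event logs with size-based rotation and a fixed number of retained backups. The file name, size limit, and backup count are hypothetical placeholders, not values drawn from the EO.

```python
# Illustrative sketch only: one common way to retain security event logs.
# The EO leaves log types and retention periods to future guidance; the
# limits below are hypothetical.
import logging
from logging.handlers import RotatingFileHandler

def make_security_logger(path="security.log",
                         max_bytes=10_000_000,
                         backup_count=30):
    """Return a logger that rotates its file at max_bytes and keeps
    backup_count old copies (a crude form of retention)."""
    logger = logging.getLogger("security")
    logger.setLevel(logging.INFO)
    handler = RotatingFileHandler(path, maxBytes=max_bytes,
                                  backupCount=backup_count)
    handler.setFormatter(logging.Formatter(
        "%(asctime)s %(levelname)s %(name)s %(message)s"))
    logger.addHandler(handler)
    return logger

logger = make_security_logger()
# Record a sample security-relevant event (hypothetical field names).
logger.info("login_success user=jdoe src=10.0.0.5")
```

In practice, retention would more often be time-based and logs would be shipped to protected, centralized storage, but the same idea applies: decide what to log, how long to keep it, and how to protect it.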

It is expected the U.S. government will ramp up efforts to strengthen its cybersecurity, and we can expect states to continue to legislate and regulate in this area. All businesses, including federal contractors, likely will experience pressure to evaluate their data privacy and security threats and vulnerabilities and adopt measures to address their risk and improve compliance.

The SolarWinds hack highlights the critical need for organizations of all sizes to include cyber supply chain risk management as part of their information security programs. It is also a reminder that privacy and security risks to an organization’s data can come from various vectors, including third-party vendors and service providers. By way of example, the Pennsylvania Department of Health recently announced a data security incident involving a third-party vendor engaged to provide COVID-19 contact tracing. The personal information of Pennsylvania residents was potentially compromised when the vendor’s employees used an unauthorized collaboration channel.

Protecting against these risks requires maintaining and implementing a third-party vendor management policy, a critical and often overlooked part of an organization’s information security program.  Appropriate vendor management helps guard against threats to an organization’s data posed by authorized third parties who have direct or indirect access. Risks can include data breaches, unauthorized use or disclosure, and corruption or loss of data. These risks may come from vendors who provide cloud storage, SaaS, payroll processing or HR services, services using connected devices, IT services, or even records disposal.

Robust vendor management policies and practices typically involve three components:

  • conducting due diligence to ensure that any third-party vendor or service provider with whom the organization shares personal information, or to whom it discloses or provides access, implements reasonable and appropriate safeguards for the privacy and security of that data;
  • contractually obligating the vendor or service provider to implement such safeguards; and
  • monitoring the vendor or service provider to ensure compliance with these contractual provisions.

While vendor management is a best practice, it is also required by certain U.S. federal laws, including the Gramm-Leach-Bliley Act and HIPAA, state laws in Massachusetts, Illinois, and California, and state regulations such as the New York Department of Financial Services Cybersecurity Rules (23 NYCRR 500). In the EU, the General Data Protection Regulation (GDPR) specifically requires a data controller to use only processors (e.g., third-party service providers) who provide sufficient written guarantees that they will implement appropriate technical and organizational measures to ensure the privacy and security of the controller’s personal data.

Aside from mandated vendor management practices, more than twenty states, including Florida, Texas, Massachusetts, New York, and Illinois, have laws requiring businesses that collect and maintain personal information to implement reasonable safeguards to protect that data. These states have been joined by the recently enacted California Privacy Rights Act (CPRA) and Virginia Consumer Data Protection Act (CDPA). Although most of these statutes do not define “reasonable safeguards,” vendor management practices, like data retention and storage limitation practices, may well constitute such a safeguard.

The Federal Trade Commission (FTC) took such a position in a Consent Agreement resolving alleged violations of the Gramm-Leach-Bliley Act (GLBA) Safeguards Rule. In its complaint, the FTC alleged several violations, including a failure to take reasonable steps to select service providers capable of maintaining appropriate safeguards for personal information provided by the company, and a failure to require service providers by contract to implement appropriate safeguards for such personal information. The Consent Agreement required the company to establish, implement, and maintain a comprehensive data security program that protects the security of certain covered information (i.e., reasonable safeguards). This requirement specifically includes selecting and retaining vendors capable of safeguarding the personal information they access through or receive from the company, and contractually requiring vendors to implement and maintain safeguards for such information.

Over recent months, companies have faced heightened risks to their information security from threat actors, increased remote work arrangements, and outsourced activities involving sensitive data. These threats, combined with a proliferation of proposed and enacted data protection laws, underscore the importance of implementing, maintaining, and monitoring a robust vendor management program.