Since March of this year, the Equal Employment Opportunity Commission (EEOC) has released guidance on a near-monthly basis addressing various FAQs concerning COVID-19 issues. The guidance has focused on disability-related inquiries, confidentiality, hiring, and reasonable accommodations under the Americans with Disabilities Act (ADA), as well as issues under Title VII of the Civil Rights Act and the Age Discrimination in Employment Act (ADEA). In its latest FAQ update posted yesterday, the EEOC covers some more practical questions employers have on several COVID-19 issues, such as testing, telecommuting, and sharing employee medical information.

COVID-19 Testing

As COVID-19 testing capabilities and resources have expanded, many employers across the country have been working on establishing testing protocols. Some still have concerns, however, about whether they are permitted to test, particularly considering the general ADA requirement that any mandatory medical test of employees be “job related and consistent with business necessity.”

The EEOC has already confirmed that employers may opt to administer COVID-19 testing to employees before initially permitting them to enter the workplace. In the updated FAQs, the EEOC further clarified that periodic testing of employees who are physically present in the workplace is permissible if it is used to determine whether an employee poses a direct threat to others. The EEOC also sought to address updates to CDC guidance. Specifically, the EEOC made clear that employers administering COVID-19 viral testing consistent with current CDC guidance will meet the ADA’s “business necessity” standard, as will following recommendations by the CDC or other public health authorities regarding whether, when, and for whom testing or other screening is appropriate. The EEOC acknowledged that the CDC and FDA may revise their recommendations based on new information, and reminded employers to stay up to date.

More on What Employers Can Ask Employees, and If Employees Refuse to Answer

For several months, employers have been building COVID-19 screening programs – taking employee temperatures and asking questions about COVID-19 symptoms and travel, among other things – before permitting employees to enter the employer’s facilities. Some employers have continued to wonder whether they are permitted under the ADA to ask employees whether they have had a COVID-19 test. The EEOC confirmed in the updated FAQs that employers may ask if employees have been tested for COVID-19. Presumably, this also means that employers may ask if the employee’s test was positive or negative, but this is not clear in the updated EEOC FAQs.

Because the permissibility of certain COVID-related inquiries is based on the existence of a direct threat, asking employees about COVID-19 testing does not extend to employees who are teleworking and not physically interacting with coworkers or others (for example, customers). Nor may employers ask whether an employee’s family members have COVID-19 or symptoms associated with COVID-19, because the Genetic Information Nondiscrimination Act (GINA) generally prohibits employers from asking employees medical questions about family members. But the EEOC clarified that employers may ask employees whether they have had contact with anyone diagnosed with COVID-19 or who may have symptoms associated with the disease.

The EEOC also further addressed whether employers may focus screening efforts on a single employee – e.g., asking only one employee COVID-19 screening questions. In this case, the employer must have a reasonable belief based on objective evidence that this person might have the disease, such as a display of COVID-19 symptoms. However, employees working regularly or occasionally onsite and who report feeling ill or who call in sick may be asked questions about their symptoms as part of workplace screening for COVID-19, according to the EEOC.

During the summer, several states began to implement mandatory and recommended quarantines for persons arriving from other states with high levels of community spread. The EEOC confirmed that employers do not have to wait until employees experience COVID-19 symptoms before asking where they have traveled, as such questions are not disability-related inquiries.

As several employers have learned, not all employees cooperate with employer-administered screening programs. When they object, employers should consider their options carefully, including whether an accommodation may be necessary. The EEOC acknowledges that the ADA allows employers to bar employees from physical presence in the workplace if they refuse to have their temperature taken or refuse to confirm whether they have COVID-19, have symptoms associated with COVID-19, or have been tested for COVID-19. Some employers want to make compliance with screening programs a condition of employment, subjecting employees to termination if they fail to comply. The EEOC did not discuss that option; however, the agency reminded employers they can gain cooperation by asking employees the reasons for their refusal. Employers also can offer information and reassurance that they are taking steps to ensure workplace safety, that those steps are consistent with health screening recommendations from the CDC, and that the employer is careful about maintaining confidentiality.

Managers Sharing Information About Employees with COVID

It is not uncommon for managers to learn about the medical condition of employees they supervise. Because the ADA requires all employee medical information to be maintained confidentially, managers who discover an employee has COVID-19 may be unsure about what they may and should do with that information. The EEOC FAQs make clear that managers may report this information to appropriate persons in the organization so that the organization can comply with public health authority guidance, such as contact tracing. Employers should consider directing managers on where to report this information, and what to report, in order to minimize who receives it. However, the EEOC clarified that it would not violate the ADA if a worker reported to her manager the COVID-19 status of a coworker in the same workplace.

Recognizing that coworkers in small workplaces might be able to identify which worker(s) triggered contact tracing efforts, the EEOC reminds employers they still may not confirm or reveal the employee’s identity. Employees who have a need to know this information about other employees should be specifically instructed to maintain its confidentiality.

Telework

Many employees continue to telework, particularly in occupations where it is feasible to do so. Being away from the office, however, does not eliminate these COVID-19 issues. For example, managers still have to maintain the confidentiality of employee medical information when they are working from home. This includes, where necessary, taking steps to limit access to the information until the manager can return to the office to store the information according to normal protocols. It also includes not disclosing the reason an employee may be teleworking or on leave if the reason is COVID-19.


While many questions remain, these updated FAQs provide some helpful guidance for employers. Of course, certain situations can present additional issues for employers to consider. And, state and local law also may modify the employer’s analysis for those jurisdictions. Employers need to keep up to date and should consult experienced counsel when navigating these issues.

Last week, in its Cybersecurity Summer Newsletter, the Office for Civil Rights (OCR) published best practices for creating an IT asset inventory list to assist healthcare providers and business associates in understanding where electronic protected health information (ePHI) is located within their organizations and in improving HIPAA Security Rule compliance. OCR investigations often find that organizations “lack sufficient understanding” of where all of their ePHI is located. While the creation of an IT asset inventory list is not required under the HIPAA Security Rule, it can be helpful in developing a risk analysis and, in turn, in implementing appropriate safeguards – both of which are HIPAA Security Rule requirements. Essentially, if an organization doesn’t know what IT assets it has or where its ePHI is, how can it effectively assess the risks associated with those assets and information and protect them?

The lack of an inventory, or an inventory lacking sufficient information, can lead to gaps in an organization’s recognition and mitigation of risks to the organization’s ePHI.  Having a complete understanding of one’s environment is key to minimizing these gaps and may help ensure that a risk analysis is accurate and thorough, as required by the Security Rule.

In general, an organization’s IT asset inventory list consists of “IT assets with corresponding descriptive information, such as data regarding identification of the asset (e.g., vendor, asset type, asset name/number), version of the asset (e.g., application or OS version), and asset assignment (e.g., person accountable for the asset, location of the asset).”

The OCR Newsletter suggests including the following types of assets in an organization’s IT asset inventory list:

  • Hardware assets that comprise physical elements, including electronic devices and media, which make up an organization’s networks and systems. This can include mobile devices, servers, peripherals, workstations, removable media, firewalls, and routers.
  • Software assets that are programs and applications which run on an organization’s electronic devices. Well-known software assets include anti-malware tools, operating systems, databases, email, administrative and financial records systems, and electronic medical/health record systems. Though lesser known, there are other programs important to IT operations and security such as backup solutions, virtual machine managers/hypervisors, and other administrative tools that should be included in an organization’s inventory.
  • Data assets that include ePHI that an organization creates, receives, maintains, or transmits on its network, electronic devices, and media. How ePHI is used and flows through an organization is important to consider as an organization conducts its risk analysis.
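As a rough illustration of the kind of record OCR describes, the descriptive fields above (identification, version, assignment) can be modeled in a simple structure. The sketch below is hypothetical; neither OCR nor the Security Rule prescribes a format, and all field, asset, and vendor names are invented:

```python
from dataclasses import dataclass

# A minimal sketch of an IT asset inventory record, modeled on the
# descriptive fields OCR mentions (identification, version, assignment).
# Field names and sample data are illustrative only.
@dataclass
class ITAsset:
    asset_id: str      # asset name/number
    asset_type: str    # "hardware", "software", or "data"
    vendor: str
    version: str       # application or OS version, if applicable
    owner: str         # person accountable for the asset
    location: str      # physical or network location
    stores_ephi: bool  # does it create/receive/maintain/transmit ePHI?

inventory = [
    ITAsset("WS-0142", "hardware", "ExampleCo", "n/a", "J. Smith", "Clinic A", True),
    ITAsset("EHR-01", "software", "ExampleVendor", "10.2", "IT Dept", "Data center", True),
    ITAsset("PRN-007", "hardware", "ExampleCo", "n/a", "Front desk", "Clinic A", False),
]

# Assets that hold ePHI warrant priority in the risk analysis.
ephi_assets = [a.asset_id for a in inventory if a.stores_ephi]
print(ephi_assets)  # → ['WS-0142', 'EHR-01']
```

Even a flat list like this lets an organization answer the threshold question OCR poses: which assets touch ePHI, and who is accountable for each.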

In addition, the OCR Newsletter recommends including IT assets that don’t necessarily store or process ePHI but that still may lead to a security incident, such as Internet of Things (IoT) or other smart devices. For example, a recent study by Quocirca, a security research firm, found that approximately 60% of businesses in the U.S., U.K., France, and Germany suffered an IoT printer-related data breach in 2019, with the average breach costing an organization approximately $400,000.

The OCR Newsletter also describes other cybersecurity-related and HIPAA compliance benefits an IT asset inventory list can provide, beyond the risk analysis. For example, HIPAA requires that covered entities and business associates “[i]mplement policies and procedures that govern the receipt and removal of hardware and electronic media that contain [ePHI] into and out of a facility, and the movement of these items within the facility,” which will be more efficient if the organization has an IT asset inventory list with location/owner/assignment information in place. Moreover, an IT asset inventory list can aid an organization in identifying and tracking devices to ensure timely updates, patches, and password changes.

HIPAA compliance is no doubt a significant challenge for large and small covered healthcare providers, and other covered entities and business associates, and data breaches are almost inevitable. Preparation of a comprehensive IT asset inventory, while not required, can go a long way in both ensuring HIPAA compliance, and preventing a security incident.  Below are some additional basic compliance measures:

  • Provide training and improve security awareness for workforce members when they begin working for the organization and periodically thereafter.
  • Maintain written policies and procedures that address required administrative, physical, and technical safeguards required under the Security Rule.
  • Maintain business associate agreements with all business associates.
  • Document compliance efforts.
  • Maintain and practice an incident response plan in the event of a data breach.

The National Security Agency (NSA) recently released helpful guidance on how to effectively limit location data exposure for its staffers, which also can be helpful information for the general public. Businesses likely will have different perspectives on location data than the NSA, which is trying to protect its staffers and its vital national security missions. Businesses may want location data about their consumers and workforce members for several reasons, but some may not realize they are even collecting this data. As laws such as the California Consumer Privacy Act (CCPA) become more widespread in the U.S., businesses will need to be more deliberate and aware of the data they are collecting.

The NSA guidance provides an outline of categories of mobile device geolocation services and recommendations on how to prevent exposure of sensitive location information and limit the amount of location data shared. The NSA also recommends pairing its guidance with an earlier Cybersecurity & Infrastructure Security Agency (CISA) security tip on privacy and mobile device apps.

As many businesses think about the categories of personal information they collect from consumers, members of their workforce, and others, geolocation may be the last thing that comes to mind. However, businesses are increasingly deploying apps, mobile phones, and devices to further their business needs. Consider the response to the COVID-19 pandemic: many businesses have obtained or developed devices and apps enabling them to more efficiently screen employees and consumers for coronavirus symptoms and to maintain social distancing. The data collected by those technologies may not be apparent; businesses may be more focused on quickly meeting CDC and state guidance. More traditionally, businesses might provide their workforce members company-owned iPhones, Fitbits for wellness programs, tablets to interact with consumers, or some other smart device or app, without realizing all of their capabilities or configurations.

“A cell phone begins exposing location data the second it is powered on because it inherently trusts cellular networks and providers. Devices’ location data, from GPS to Wi-Fi or Bluetooth connections, may be acquired by others with or without the user or provider’s consent,” states the NSA. “Anything that sends and receives wireless signals has location risks similar to phones, including Internet of Things (IoT) devices, vehicles and many products with “smart” included in the name.”

The NSA, of course, has different considerations for collecting and managing location data than most businesses. For businesses, such information can be helpful to serve legitimate business needs. However, under the CCPA, for example, businesses need to provide “consumers” (which currently includes employees and applicants residing in California) with a notice at collection. This notice must explain the categories of personal information that the business collects, and one of those categories is geolocation data. The notice also must explain the purposes for which the business will use such data. As businesses work through the process of rolling out new technologies, therefore, they’ll need to consider the scope of data collection, even if they are not interested in the data capable of being collected. If a business determines location data is not needed, the NSA guidance can be helpful, as it provides mitigation tips to limit the collection of such data:

  • Disable location service settings on the device.
  • Disable radios when they are not actively in use: disable Bluetooth and turn off Wi-Fi if these capabilities are not needed.
  • Use Airplane Mode when the device is not in use.
  • Apps should be given as few permissions as possible (e.g., set privacy settings to ensure apps are not using or sharing location data).
  • Turn off settings (typically known as FindMy or Find My Device settings) that allow a lost, stolen, or misplaced device to be tracked.
  • Set browser privacy/permission location settings to not allow location data usage.
  • Use an anonymizing Virtual Private Network (VPN) to help obscure location.
  • Minimize the amount of data with location information that is stored in the cloud, if possible.

Of course, there are situations where a business will want location data collected, such as on company-provided devices with Find My Device capabilities that allow lost, stolen, or misplaced devices to be located.

As many who have gone through compliance with the General Data Protection Regulation in the European Union know, the CCPA and other laws that may come after it in the U.S. will require businesses to think more carefully about the personal information they collect, including location data. The NSA guidance is helpful in identifying steps a business can take to apply best practices to its collection of location data.


On July 21, 2020, the New York Department of Financial Services (“DFS”) filed its first enforcement action under New York’s Cybersecurity Requirements for Financial Services Companies, 23 N.Y.C.R.R. Part 500 (“Reg 500”).    Reg 500, which took effect in March 2017, imposes wide-ranging and rigorous requirements on subject organizations and their service providers, which are summarized here.

According to the Statement of Charges, First American Title Insurance Co. (“First American”) failed to remediate a vulnerability on its public-facing website, thereby exposing millions of documents containing sensitive consumer information – including bank account numbers, mortgage and tax records, social security numbers, wire transaction receipts, and drivers’ license images – to unauthorized access.  More specifically, DFS claims that First American failed to:

  • Conduct a security review and risk assessment of the vulnerability – steps that were mandated by the Company’s own cybersecurity policies;
  • Properly classify the level of risk associated with the website vulnerability;
  • Adequately investigate that vulnerability (the Company reviewed only a tiny fraction of the impacted documents and, as a result, severely underestimated the seriousness of the vulnerability); and
  • Heed the advice of the Company’s internal cybersecurity team, which advised that further investigatory actions were needed.

The foregoing failures, DFS contends, violated six provisions of Reg 500.  Specifically:

  1. 23 NYCRR 500.02: The requirement to maintain a cybersecurity program that is designed to protect the confidentiality, integrity and availability of the covered entity’s information systems, and which is based on the covered entity’s risk assessment.
  2. 23 NYCRR 500.03: The requirement to maintain a written policy or policies, approved by senior management, setting forth the covered entity’s policies and procedures for the protection of its information systems and the nonpublic personal information (“NPI”) stored on those systems.
  3. 23 NYCRR 500.07: The requirement to limit user access privileges to information systems that provide access to NPI and periodically review such access privileges.
  4. 23 NYCRR 500.09: The requirement to conduct a periodic risk assessment of the covered entity’s information systems to inform the design of its cybersecurity program.
  5. 23 NYCRR 500.14(b): The requirement to provide regular cybersecurity awareness training for all personnel as part of the covered entity’s cybersecurity program, and to update such training to reflect risks identified by the covered entity in its risk assessment.
  6. 23 NYCRR 500.15: The requirement to implement controls, including encryption, to protect NPI held or transmitted by the covered entity both in transit over external networks and at rest.

The case against First American is scheduled to proceed to an administrative hearing on October 26, 2020.  DFS is seeking civil penalties, along with an order requiring the Company to remedy its violations of Reg 500.  Each violation of Reg 500 carries a potential penalty of up to $1,000 and DFS is taking the position that each instance where NPI was subject to unauthorized access constituted a separate violation.  DFS alleges that hundreds of millions of documents were exposed to potential unauthorized access as a result of First American’s alleged violations and that, according to the Company’s own analysis, more than 350,000 documents were accessed without authorization as a result of the Company’s website vulnerability.  If DFS’s position on what constitutes a single violation prevails, First American could be exposed to hundreds of millions of dollars in civil penalties.
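The stakes of DFS’s per-document theory can be illustrated with simple arithmetic. The following back-of-the-envelope sketch uses only the figures described above ($1,000 statutory maximum per violation; roughly 350,000 documents the Company’s own analysis found were accessed without authorization) and is illustrative, not a prediction of any actual penalty:

```python
# Hypothetical exposure under DFS's theory that each document subject to
# unauthorized access constitutes a separate Reg 500 violation.
MAX_PENALTY_PER_VIOLATION = 1_000   # dollars, statutory maximum per violation
documents_accessed = 350_000        # per the Company's own analysis, as alleged

max_exposure = documents_accessed * MAX_PENALTY_PER_VIOLATION
print(f"${max_exposure:,}")  # → $350,000,000
```

Even using only the confirmed-access figure, rather than the hundreds of millions of documents allegedly exposed, the theory yields exposure in the hundreds of millions of dollars.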

The case against First American may signal that DFS, after giving covered organizations several years to get their compliance programs in order, now intends to aggressively enforce Reg 500’s requirements. To prepare for this eventuality, subject organizations need to closely scrutinize their compliance programs – including their policies and procedures for conducting security reviews and risk assessments, and for investigating and responding to security incidents – and take proactive steps to plug any gaps in those programs. We have prepared several articles, blog posts, and webinars to help organizations determine what Reg 500 requires and to assess their compliance with those requirements.

Whether it is facial recognition technology being used in connection with COVID-19 screening tools and in law enforcement, continued use of fingerprint-based time management systems, or the use of various biometric identifiers for physical security and access management, applications involving biometric identifiers and information in the public and private sectors continue to grow. Concerns about the privacy and security of that information continue to grow as well. Several states have laws protecting biometric information in one form or another, chief among them Illinois, but the desire for federal legislation remains.

Modeled after Illinois’s Biometric Information Privacy Act (BIPA), the National Biometric Information Privacy Act (Act), proposed by Sens. Jeff Merkley and Bernie Sanders, contains three key provisions:

  • A requirement to obtain consent from individuals prior to collecting and disclosing their biometric identifiers and information.
  • A private right of action against entities covered by the Act that violate its protections which entitles aggrieved individuals to recover, among other things, the greater of (i) $1,000 in liquidated damages or (ii) actual damages, for negligent violations of the protections granted under the law.
  • An obligation to safeguard biometric identifier or biometric information in a manner similar to how the organization safeguards other confidential and sensitive information, such as Social Security numbers.

The Act would apply to “private entities,” generally including a business of any size in possession of biometric identifiers or biometric information of any individual. Federal, state, and local government agencies and academic institutions are excluded from the Act.

Under the Act, private entities would be required to:

  • Develop and make available to the public a written policy establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information. That schedule may not extend more than one year beyond an individual’s last interaction with the entity, and destruction could be required earlier;
  • Collect biometric identifiers or biometric information only when needed to provide a service to the individuals or have another valid business reason;
  • Inform individuals that their biometric identifiers or biometric information is being collected or stored, along with the purpose and length of the collection, storage, or use, and obtain a written release from individuals that may not be combined with other consents, including an employment agreement;
  • Obtain a written release immediately prior to the disclosure of any biometric identifier or biometric information that includes the data to be disclosed, the reason for the disclosure, and the recipients of the data; and
  • Maintain the information using a reasonable standard of care.

Readers familiar with the BIPA in Illinois will find these requirements familiar. Readers familiar with the California Consumer Privacy Act (CCPA) will find the following “Right to Know” familiar as well. The Act would grant individuals the right to request certain information about biometric identifiers or biometric information collected by a covered entity within the preceding 12-month period. This information includes “specific pieces of personal information” and “the categories of third parties with whom the business shares the personal information.” The Act uses “personal information” but does not define it, leaving it unclear if it pertains only to biometric identifiers and biometric information.

Most troubling is the private right of action provision referenced above. The Act uses language similar to that in the BIPA, which has led to a flood of class action litigation, including a decision by the Illinois Supreme Court finding that plaintiffs need not show actual harm to recover under the law. The legislative process likely will result in some modification to the bill, assuming it survives at all, as federal privacy proposals often do not. Nonetheless, we will continue to monitor this and similar legislation.

Despite several attempts, Congress has struggled to push forward a federal consumer privacy law over the past few years. But the COVID-19 pandemic, which has raised concerns regarding location monitoring, GPS tracking, and use of health data, has heightened the urgency for federal consumer privacy legislation. In May, a group of Democrats from the U.S. Senate and House of Representatives introduced the Public Health Emergency Privacy Act (“the Act”), aimed at protecting health information during the pandemic and regulating the use of that data by contact tracing technologies.

In late July, the Senate Committee on Appropriations introduced an Emergency Coronavirus Stimulus Package (“the Stimulus Package”), which would allocate $53 million of the $306 million package to the Department of Homeland Security’s Cybersecurity and Infrastructure Security Agency for the protection of coronavirus research data and related data. In addition, a group of 13 senators, including Kamala Harris, D-California, Elizabeth Warren, D-Massachusetts, and Mark Warner, D-Virginia, sent a letter to Senate and House leadership asking for the Act to be included in the Stimulus Package.

“Health data is among the most sensitive data imaginable and even before this public health emergency, there has been increasing bipartisan concern with gaps in our nation’s health privacy laws,” the Senators stated in their letter.

“While a comprehensive update of health privacy protections is unrealistic at this time, targeted reforms to protect health data – particularly with clear evidence that a lack of privacy protections has inhibited public participation in screening activities – is both appropriate and necessary,” they added.

Under the Act, “Covered Organization” is defined as “any person that collects, uses, or discloses emergency health data electronically or through communication by wire or radio; or that develops or operates a website, web application, mobile application, mobile operating system feature, or smart device application for the purpose of tracking, screening, monitoring, contact tracing, or mitigation, or otherwise responding to the COVID–19 public health emergency.” NOTE: Covered Organizations do not include: a health care provider; a person engaged in a de minimis collection or processing of emergency health data; a service provider; a person acting in their individual or household capacity; or a public health authority.

The Act would protect “emergency health data” which means “data linked or reasonably linkable to an individual or device, including data inferred or derived about the individual or device from other collected data provided such data is still linked or reasonably linkable to the individual or device, that concerns the public COVID–19 health emergency.” Examples of such data include:

  • information that reveals the past, present, or future physical or behavioral health or condition of, or provision of healthcare to, an individual, including data derived from testing an individual. This likely would include COVID-19 viral or serological test results, along with genetic data, biological samples, and biometrics;
  • other data collected in conjunction with other emergency health data or for the purpose of tracking, screening, monitoring, contact tracing, or mitigation, or otherwise responding to the COVID–19 public health emergency, such as (i) geolocation and similar information for determining the past or present precise physical location of an individual at a specific point in time, (ii) proximity data that identifies or estimates the past or present physical proximity of one individual or device to another, including information derived from Bluetooth, audio signatures, nearby wireless networks, and near-field communications; and (iii) any other data collected from a personal device.

Below are key requirements of the Act for Covered Organizations:

  • Only collect, use or disclose data that is necessary, proportionate and limited for a good-faith health purpose;
  • Take reasonable measures, where possible, to ensure the accuracy of data and provide a mechanism for individuals to correct inaccuracies;
  • Adopt reasonable safeguards to prevent unlawful discrimination on the basis of emergency health data;
  • Only disclose data to a government entity if it is to a public health authority and is solely for good faith public health purposes;
  • Establish and implement reasonable data security policies, practices and procedures;
  • Obtain affirmative express consent before collecting, using or disclosing emergency health data, and provide individuals with an effective mechanism to revoke that consent. NOTE: There are limited exceptions where consent is not required including to protect from fraud/malicious activity, to prevent a security incident, or if otherwise required by law;
  • Provide notice in the form of a privacy policy prior to collection that describes how and for what purposes the data will be used (including categories of recipients), the organization’s data security policies and practices, and how individuals may exercise their rights.

If the Act is enacted, the Federal Trade Commission (FTC) would be required to promulgate rules regarding data collection, use, and disclosure under the Act. In addition, both the FTC and state attorneys general would have enforcement authority over the Act.

The Act, if passed, would be a temporary measure that would terminate once COVID-19 is no longer deemed a public health emergency. Covered organizations would be required to stop using or maintaining emergency health data 60 days after the termination of the public health emergency, and to destroy such data or render it not linkable.

With no comprehensive federal privacy framework in place, the Senators urged Congressional leadership to advance a measure that provides “Americans with assurance that their sensitive health data will not be misused,” which they argue “will give Americans more confidence to participate in COVID screening efforts, strengthening our common mission in containing and eradicating COVID-19.”

We will continue to provide updates on the status of the Act and other related developments.

Businesses are now prohibited from transferring employee personal data from the European Economic Area (EEA) to the U.S. under the EU-U.S. Privacy Shield program. The Court of Justice of the European Union (CJEU) declared the EU-U.S. Privacy Shield invalid in Data Protection Commissioner v. Facebook Ireland and Schrems (C-311/18) (Schrems II), effective immediately. Businesses that relied on the EU-U.S. Privacy Shield as an adequate transfer mechanism can no longer perform routine activities such as sending employee data from the EEA to U.S. headquarters for HR administration, accessing a global HR database from the U.S., remotely accessing EEA user accounts from the U.S. for IT services, providing EEA data to third party vendors for processing in the U.S., or relying on certain cloud-based services.

The EU-U.S. Privacy Shield program was designed to provide EEA data with a level of protection comparable to EU law upon transfer to the U.S. The CJEU invalidated the program, stating that U.S. companies could not provide an essentially equivalent level of protection based on the breadth of U.S. national security surveillance laws: FISA 702, E.O. 12333, and PPD-28.

U.S. businesses must now identify an alternative mechanism to transfer employee data from the EEA to the U.S. Many businesses rely on transfer mechanisms such as binding corporate rules (BCRs) for intragroup transfers, or standard contractual clauses (SCCs) for intracompany transfers as well as transfers to third parties. SCCs are clauses approved by the EU as providing reasonable safeguards to data transferred from the EEA. The CJEU did not invalidate either of these transfer mechanisms in Schrems II but placed SCCs under heightened scrutiny. The Court emphasized the data exporter’s obligation to verify the data importer’s ability to provide EEA data with an adequate level of protection. The data exporter must review each transfer to determine on a case-by-case basis whether the SCCs provide sufficient reasonable safeguards, particularly in light of the recipient country’s surveillance laws. As a result, data exporters must review applicable local legislation for each transfer to identify whether the SCCs are adequate, whether supplemental protective measures are required, or whether the transfer cannot occur. A comparable analysis will apply to BCRs.

Businesses seeking an alternative to the EU-U.S. Privacy Shield, BCRs, or SCCs should review whether a transfer may fall under one of several exceptions to the GDPR’s requirement of an adequate transfer mechanism. Many of these exceptions, however, apply only when the transfer is necessary, occasional, and affects a limited number of data subjects.

Under the GDPR, an impermissible transfer can result in fines of up to €20,000,000 or, in the case of an undertaking, up to four percent of total worldwide annual turnover for the preceding financial year, whichever is higher. In addition, EEA data subjects may bring a private cause of action against the data exporter for an illegal transfer, either individually or as part of a class action.
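The fine ceiling described above is simply the higher of two figures. As a rough illustration (not legal advice; actual fines are set by supervisory authorities and may be far below the cap), the upper limit can be sketched as:

```python
def gdpr_fine_cap(annual_turnover_eur: float) -> float:
    """Illustrative upper limit on a GDPR fine for an impermissible
    transfer: the higher of EUR 20 million or 4% of total worldwide
    annual turnover for the preceding financial year."""
    return max(20_000_000.0, 0.04 * annual_turnover_eur)

# For an undertaking with EUR 1 billion in turnover, 4% exceeds
# EUR 20 million, so the cap is EUR 40 million.
print(gdpr_fine_cap(1_000_000_000))  # 40000000.0
```

For smaller undertakings, the €20,000,000 floor governs, since four percent of their turnover falls below it.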

The CJEU’s decision creates great uncertainty about the future of transatlantic data transfers. As the EU and U.S. negotiate the path forward, U.S. businesses should review their employee data flows, identify whether they or their sub-contractors are subject to U.S. national security laws, and determine the feasibility of additional contractual or technical measures to supplement the reasonable safeguards.

Please see our full article (here) and FAQs (here) for additional information.

Roger Severino, Director of the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS), provides advice for HIPAA covered health care providers:

When informed of potential HIPAA violations, providers owe it to their patients to quickly address problem areas to safeguard individuals’ health information.

According to OCR, a small health care provider in North Carolina, Metropolitan Community Health Services, reported a data breach on June 9, 2011. The breach involved the impermissible disclosure of protected health information to an unknown email account, affecting 1,263 patients. It is not clear when OCR’s investigation commenced, but it “revealed longstanding, systemic noncompliance with the HIPAA Security Rule…Metro failed to conduct any risk analyses, failed to implement any HIPAA Security Rule policies and procedures, and neglected to provide workforce members with security awareness training until 2016.” Under the Resolution Agreement reached with OCR, Metro agreed to a two-year corrective action plan (CAP) and to pay $25,000.

OCR considered that Metro is a Federally Qualified Health Center that provides a variety of discounted medical services to an underserved population in rural North Carolina, but that did not stop it from taking enforcement action against a relatively small covered entity. OCR has taken similar enforcement actions against other small health care providers as well.

HIPAA compliance is no doubt a significant challenge for covered health care providers large and small, as well as other covered entities and business associates. In addition, data breaches can be nearly impossible to prevent in all cases. However, these and other OCR enforcement actions suggest that with some relatively basic compliance measures, small providers can fare better during OCR investigations. Here are some examples:

  • Conduct a risk assessment that considers the threats and vulnerabilities to protected health information.
  • Maintain written policies and procedures that address required administrative, physical, and technical safeguards required under the Security Rule.
  • Provide training and improve security awareness for workforce members when they begin working for the organization and periodically thereafter.
  • Maintain business associate agreements with all business associates.
  • Document compliance efforts.

And, of course, evaluate compliance following a reported data breach and make the necessary improvements.

On July 16, 2020, the Court of Justice of the European Union (CJEU) published its decision in the matter of Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (“Schrems II”). The matter, arising from the transfer of Schrems’ personal data by Facebook Ireland to Facebook Inc. in the United States, presented questions concerning the transfer of personal data from the EEA to a third country without an adequacy determination. The decision declares the EU-U.S. Privacy Shield program invalid and affirms the validity of standard contractual clauses (SCCs) as an adequate mechanism for transferring personal data from the EEA, subject to heightened scrutiny.

The CJEU invalidated the Privacy Shield program on grounds that it fails to provide an adequate level of protection to personal data transferred from the EEA to the U.S. In support, it points specifically to three U.S. national security laws: FISA 702, E.O. 12333, and PPD-28. The CJEU found that the breadth of these bulk surveillance and monitoring laws violates the basic minimum safeguards required by the GDPR for proportionality: the U.S. government’s processing of EEA personal data is not limited to what is strictly necessary. The CJEU further noted these surveillance programs fail to provide EEA data subjects with enforceable rights and effective legal review comparable to applicable EU law. As of the date of the decision, data exporters and U.S. data importers can no longer rely on EU-U.S. Privacy Shield certification as an adequate mechanism to transfer personal data from the EEA to the U.S. There is currently no grace period. However, since a grace period was enacted shortly after the EU-U.S. Safe Harbor was invalidated, it is conceivable one will be announced as the EU and U.S. assess the implications of this decision.

The CJEU affirmed the validity of controller-processor standard contractual clauses (SCCs) as an adequate mechanism for transferring personal data from the EEA to a third country lacking an EU adequacy decision. In affirming the validity of SCCs, the CJEU highlighted three stakeholder obligations:

  1. the data exporter’s responsibility to verify the importer’s ability to provide an essentially equivalent level of protection in the third country;
  2. the data importer’s responsibility to notify the exporter immediately if it cannot comply with the SCCs, including situations where it is compelled to produce EEA data at the request of law enforcement; and
  3. the data exporter’s responsibility to immediately suspend or terminate the transfer upon notice from the importer that it cannot comply with the SCCs.

Based on these requirements, the SCCs may not be an adequate transfer mechanism in every case, or may require the negotiation of additional provisions to satisfy these obligations.

The CJEU further highlighted the affirmative obligation of supervisory authorities to identify and suspend or terminate transfers based on SCCs where the importer cannot provide EEA data with an adequate level of protection.

Under the GDPR, an impermissible transfer can result in fines of up to €20,000,000 or, in the case of an undertaking, up to four percent of total worldwide annual turnover for the preceding financial year, whichever is higher. In addition, EEA data subjects may bring a private cause of action for an illegal transfer, either individually or as part of a class action.

Please see our full article (here) and FAQs (here) for additional information.

Back in October 2019, the U.S. Supreme Court was petitioned to review a Ninth Circuit ruling regarding the Telephone Consumer Protection Act (“TCPA”) on the following issues: 1) whether the TCPA’s prohibition on calls made by an automatic telephone dialing system (“ATDS”) is an unconstitutional restriction of speech, and if so, whether the proper remedy is to broaden the prohibition to abridge more speech; and 2) whether the definition of ATDS in the TCPA encompasses any device that can “store” and “automatically dial” telephone numbers, even if the device does not “us[e] a random or sequential number generator.” The Court has now granted certiorari, limited to review of the second question.

ATDS Circuit Split

When the TCPA was enacted in 1991, most American consumers were using landline phones, and Congress could not begin to contemplate the evolution of the mobile phone. The TCPA defines an “automatic telephone dialing system” (ATDS) as “equipment which has the capacity—(A) to store or produce telephone numbers to be called, using a random or sequential number generator; and (B) to dial such numbers.” 47 U.S.C. § 227(a)(1). In 2015, the Federal Communications Commission (FCC) issued its 2015 Declaratory Ruling & Order (2015 Order), addressing how the TCPA applies in the mobile era, including the definition of an ATDS and which devices qualify. The 2015 Order only complicated matters further, providing an expansive interpretation of what constitutes an ATDS and sparking a surge of TCPA lawsuits in recent years.

Consequently, several FCC-regulated entities appealed the 2015 Order to the D.C. Circuit in ACA International v. FCC, No. 15-1211, Doc. No. 1722606 (D.C. Cir. Mar. 16, 2018). The D.C. Circuit concluded that the FCC’s position that any equipment with the potential capacity for autodialing is subject to the TCPA was too broad. Although the FCC did state in its 2015 Order that “there must be more than a theoretical potential that the equipment could be modified to satisfy the ‘autodialer’ definition,” the court held that this “ostensible limitation affords no ground for distinguishing between a smartphone and a Firefox browser.” The court determined that the FCC’s reading of ATDS was “an unreasonably expansive interpretation of the statute.”

Since the decision in ACA Int’l, courts have weighed in on the D.C. Circuit’s ruling and the status of the 2015 Order, sparking a circuit split over what constitutes an ATDS. The Second and Ninth Circuits have both broadly interpreted the definition of an ATDS, while the Third, Seventh, and Eleventh Circuits have taken a much narrower reading. For example, earlier this year the Eleventh and Seventh Circuits reached similar conclusions, back to back, narrowly holding that the TCPA’s definition of an ATDS includes only equipment capable of storing or producing numbers using a “random or sequential” number generator, excluding most “smartphone age” dialers. By contrast, the Ninth Circuit has concluded that “an ATDS need not be able to use a random or sequential generator to store numbers[.]” The court explained that “it suffices to merely have the capacity to ‘store numbers to be called’ and ‘to dial such numbers automatically.’”

Supreme Court Petition

The Supreme Court has granted the petition for review of the Ninth Circuit ruling on the issue of whether the definition of “ATDS” in the TCPA encompasses any device that can “store” and “automatically dial” telephone numbers, even if the device does not “us[e] a random or sequential number generator.” The Supreme Court’s decision should help resolve the circuit split and provide greater clarity and certainty for parties facing TCPA class action litigation. The Court is expected to hear oral argument at the start of next term, in the fall, and to issue a decision by the summer of 2021.

Takeaway

2020 is shaping up to be an important year for the TCPA. We recently reported on the much-anticipated Supreme Court decision in Barr v. American Association of Political Consultants, in which the Court weighed in on the constitutionality of the TCPA, holding that the statute’s government-debt-collection exception violated the First Amendment and must be invalidated and severed from the remainder of the statute. While courts generally appear to be narrowing the TCPA in a myriad of respects, organizations are still advised to err on the side of caution during this period of uncertainty when implementing and updating telemarketing and automatic dialing practices.