Massachusetts Attorney General Creates Data Privacy and Security Division

The Massachusetts Office of the Attorney General has created a new Data Privacy and Security Division charged with protecting consumers from threats to the privacy and security of their data. Attorney General Maura Healey announced, “The Data Privacy and Security Division will build on our office’s commitment to empowering Massachusetts consumers in the digital economy, ensuring that companies are protecting personal data, and promoting equal and open access to the internet.”

Attorney General Healey announced that the Data Privacy and Security Division will “investigate and enforce the Massachusetts Consumer Protection Act and Data Breach Law to protect the security and privacy of consumers’ data.” This new Data Privacy and Security Division is the latest development in increasing efforts by Massachusetts officials to address cybersecurity concerns. In the Fall of 2019, Massachusetts Governor Charlie Baker introduced an expansive cybersecurity program, including statewide workshops for municipalities to work together to enhance their cybersecurity capabilities.

Notably, last spring, Massachusetts updated its data breach notification law with changes that are likely to create opportunities for enforcement by the new division. In particular, the amended law expanded the content requirements for notifications to the Attorney General and the Office of Consumer Affairs and Business Regulation (OCABR) to include, among other things, whether the business that experienced the breach maintains a written information security program (WISP) and whether it has updated that WISP. Employers maintaining personal information of Massachusetts residents should revisit their incident response plan (or develop one).

Employers operating in Massachusetts or holding data on Massachusetts residents should be aware of the focus that Governor Baker and Attorney General Healey have placed on cybersecurity. These Massachusetts programs highlight the importance of conducting risk assessments to identify and address potential vulnerabilities to hackers as well as security risks created by employees and contractors.

OCR is Serious About Patients’ Rights to Access Records, Announcing Enforcement Actions Against 5 Providers

When providers, health plans, business associates, and even patients and plan participants think of the HIPAA privacy and security rules (“HIPAA Rules”), they seem to be more focused on the privacy and security aspects of the HIPAA Rules. That is, for example, safeguarding an individual’s protected health information (PHI) to avoid data breaches or improper disclosures to persons without authority to receive it. An equally important aspect of the HIPAA Rules, however, is ensuring patient access to health records, as shown by recent enforcement activity announced yesterday by the Office for Civil Rights (OCR) at the U.S. Department of Health and Human Services (HHS).

Last year, OCR commenced its Right of Access Initiative, an enforcement priority for 2019 to support individuals’ right to timely access to their health records at a reasonable cost. At least one study found providers are struggling to fully comply with the right of access requirement under HIPAA, a right that also exists under state law. A study posted on medRxiv and reported in HIPAAJournal highlights this issue. During the study, 51 providers were sent medical record access requests, and the results showed:

More than half (51%) of the providers assessed were either not fully compliant with the HIPAA right of access or it too[k] several attempts and referrals to supervisors before requests were satisfied in a fully compliant manner…

The researchers also conducted a telephone survey of 3,003 healthcare providers, asking about their policies and procedures for releasing patient medical records. The researchers suggest as many as 56% of healthcare providers may not be fully compliant with the HIPAA right of access, and 24% did not appear to be fully aware of the fee limitations for providing copies of medical records.

What is the right to access under HIPAA?

The HIPAA Privacy Rule generally requires HIPAA covered entities (health plans and most health care providers) to provide individuals, upon request, with access to PHI about them in one or more “designated record sets” maintained by or for the covered entity. This includes the right to inspect or obtain a copy, or both, as well as to direct the covered entity to transmit a copy to a designated person or entity of the individual’s choice. This right applies for as long as the covered entity (or its business associate) maintains the information, regardless of the date the information was created, and whether the information is maintained in paper or electronic systems onsite, remotely, or is archived.

When implementing this rule, covered entities and their business associates have several issues to consider, such as:

  • What information is subject to the right and what information is not, such as psychotherapy notes.
  • Confirming the authority of a “personal representative” to act on behalf of an individual.
  • Procedures for receiving and responding to requests – such as written request requirements, verifying the authority of requesting parties, timeliness of response, whether and on what grounds requests may be denied, and fees that can be charged for approved requests.

To assist covered entities (and business associates), the OCR provides a summary of right of access issues, as well as a set of frequently asked questions.

Enforcement of the Right to Access

The five enforcement actions announced yesterday are not the first enforcement actions taken by OCR. In September 2019, OCR settled a complaint with a provider for $85,000 after alleging the provider failed to respond to a patient’s request for access. In December 2019, OCR settled a second complaint, again for $85,000, to address similar allegations: failure to respond in a timely manner, as well as failing to forward the medical records in the requested format and charging more than the reasonable, cost-based fees allowed under HIPAA.

The five more recent cases involve very similar allegations against mostly small health care providers, including at least one not-for-profit: namely, the failure to provide patients with access to their protected health information as required under the HIPAA Rules. The total amount of the settlements with these five entities is $136,500.

“Patients can’t take charge of their health care decisions without timely access to their own medical information,” said OCR Director Roger Severino. “Today’s announcement is about empowering patients and holding health care providers accountable for failing to take their HIPAA obligations seriously enough,” Severino added.

Getting Compliant

Providers receive all kinds of requests for medical and other records in the course of running their businesses. Reviewing and responding to these requests no doubt creates administrative burdens. However, buying forms online might not give a practice all it needs, and could put the practice at additional risk if those forms are followed without considering state law or are not implemented properly.

Putting in place relatively simple policies, carefully developing template forms, assigning responsibility, training, and documenting responses can go a long way toward minimizing the risk of an OCR enforcement action and its severity. Providers should also consider sanctions under state law that might flow from failing to provide patients access to their records. It is worth noting that in some cases state law may be more stringent than HIPAA concerning the right of access, requiring modifications to the processes practices follow for providing access.

Michigan Considers Enhanced Data Breach Notification Law

Privacy and security continue to be at the forefront for legislatures across the nation, despite (or perhaps because of) the COVID-19 pandemic.  In late May, with back-to-back amendments, Washington, D.C. and Vermont significantly overhauled their data breach notification laws, including expansion of the definition of personal information and heightened notice requirements.  Now, Michigan may follow suit.

Earlier this month, the Michigan House of Representatives voted to advance House Bills 4186-87, sponsored by state Rep. Diana Farrington of Utica, which would create the Data Breach Notification Act and exempt entities subject to the new act from similar provisions of Michigan’s existing Identity Theft Protection Act. Unlike other states that have expanded on already existing data breach notification laws, this bill would effectively replace Michigan’s prior law in its entirety.

“This proposal puts Michigan consumers first when there are instances of compromised data,” said Farrington, who chairs the House Financial Services Committee. “Consumer protections are always important – and now many people across Michigan and in Macomb County have been put in dire financial straits through no fault of their own due to COVID-19. They don’t need the additional stress that is brought on when your personal information is potentially in someone else’s hands.”

Below are highlights of Michigan’s new data breach notification bill:

  • Expansion of the definition of “sensitive personally identifying information” (PII). Following many other states, the new bill expands the definition of PII to include a state resident’s first name or first initial and last name in combination with one or more of the following data elements that relate to the resident:
    • A nontruncated Social Security number, driver license number, state personal identification card number, passport number, military identification number, or other unique identification number issued on a government document.
    • A financial account number.
    • A medical or mental history, treatment, or diagnosis issued by a health care professional.
    • A health insurance policy number or subscriber identification number and any unique identifier used by a health insurer.
    • A username or email address, in combination with a password or a security question and answer, that would allow access to an online account that is likely to have or is used to obtain sensitive personally identifying information.
  • Notification requirements to affected state residents. A covered entity would be required to provide notice to state residents whose PII was acquired in the breach as expeditiously as possible and without unreasonable delay, taking into account the time necessary to conduct an investigation and determine the scope of the breach, but not more than 45 days after its determination that a breach has occurred (unless law enforcement determines that such notification could interfere with a criminal investigation or national security). Written notice must at least include the following:
    • The date, estimated date, or estimated date range of the breach.
    • A description of the PII acquired as part of the breach.
    • A general description of the actions taken to restore the security and confidentiality of the PII involved in the breach.
    • A general description of steps a state resident can take to protect against identity theft, if the breach creates a risk of identity theft.
    • Contact information that the state resident can use to ask about the breach.
  • Notification requirements to state agency. If the number of state residents to be notified exceeds 750, the entity would have to provide written notice to Michigan’s Department of Technology, Management & Budget within the same time frame as notification to affected residents. Written notice must at least include a synopsis of events surrounding the breach, approximate number of state residents notified, any related services the covered entity is offering to state residents, and how the state resident can obtain additional information.
  • Substitute Notice. Under the bill, a covered entity required to provide notice could instead provide substitute notice, if direct notice is not feasible due to excessive cost or lack of sufficient contact information. For example, the cost of direct notification would be considered excessive if it exceeded $250,000.
  • Reasonable Security Measures. Michigan would join many other states that mandate businesses implement and maintain reasonable security measures designed to protect PII against a breach. When developing security measures, entities may consider their size, the amount of PII they own or license and the activities involving it, and the cost to maintain such measures relative to their resources.
  • Data Disposal. Covered entities and third-party agents would be required to take reasonable measures to dispose of or arrange to dispose of PII when retention is no longer required by law. Disposal requires shredding, erasing or otherwise modifying PII to make it unreadable or undecipherable.
  • Penalties. The bill in its current form would not create a private right of action. However, a person that knowingly violates a notification requirement could be ordered to pay a fine of up to $2,000 for each violation, or not more than $5,000 per day for each consecutive day the covered entity fails to take reasonable action to comply with the requirements, up to a cap of $250,000. The attorney general would have exclusive enforcement authority. (A brief illustrative sketch of the timing and penalty provisions appears after this list.)
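
To make the timing and penalty mechanics above concrete, here is a minimal sketch, in Python, of how an organization might model them. The constant and function names are our own, the figures are taken from the bill summary above, and the example is illustrative only, not legal advice or part of the bill.

```python
from datetime import date, timedelta

# Figures taken from the bill summary above (House Bills 4186-87); illustrative only.
NOTICE_DEADLINE_DAYS = 45        # resident notice due within 45 days of determining a breach occurred
AGENCY_NOTICE_THRESHOLD = 750    # agency notice triggered when more than 750 residents are notified
DAILY_PENALTY = 5_000            # per consecutive day of continued noncompliance
PENALTY_CAP = 250_000            # aggregate cap described in the bill


def resident_notice_deadline(determination_date: date) -> date:
    """Latest date for notifying affected Michigan residents."""
    return determination_date + timedelta(days=NOTICE_DEADLINE_DAYS)


def agency_notice_required(residents_notified: int) -> bool:
    """Whether written notice to the Department of Technology, Management & Budget is triggered."""
    return residents_notified > AGENCY_NOTICE_THRESHOLD


def continued_noncompliance_penalty(days: int) -> int:
    """Illustrative exposure for continued failure to comply, capped at $250,000."""
    return min(days * DAILY_PENALTY, PENALTY_CAP)


# Example: a breach determined on June 1, 2020, affecting 1,200 residents,
# with reasonable action to comply delayed 60 days.
print(resident_notice_deadline(date(2020, 6, 1)))   # 2020-07-16
print(agency_notice_required(1_200))                # True
print(continued_noncompliance_penalty(60))          # 250000 (60 x $5,000 = $300,000, capped)
```

If the bill is enacted in a different form, the figures above would, of course, need to be revisited.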

The bill now moves on to the Michigan Senate for further consideration. This legislation would keep Michigan in line with other states across the nation that are enhancing their data breach notification laws in light of the significant uptick in the number and scale of data breaches and heightened public awareness.  Organizations across the United States should be evaluating and enhancing their data breach prevention and response capabilities.

City of Portland Bans Private Entities From Using Facial Recognition Technologies

The City of Portland, Oregon has become the first city in the United States to ban the use of facial recognition technologies in the private sector, citing, among other things, a lack of standards for the technology and wide ranges in accuracy and error rates that differ by race and gender. Failure to comply can be painful. Similar to the remedy available under the Illinois Biometric Information Privacy Act, which has fueled hundreds of class action lawsuits, the Ordinance provides persons injured by a material violation a cause of action for damages or $1,000 per day for each day of violation, whichever is greater. The Ordinance is effective January 1, 2021.

Facial recognition technology has become more popular in recent years, including during the COVID-19 pandemic. As the need arose to screen persons entering a facility for symptoms of the virus, including elevated temperature, thermal cameras, kiosks, and other devices embedded with facial recognition capabilities were put into use. However, many have objected to the use of this technology in its current form, citing problems with the accuracy of the technology, as summarized in a June 9, 2020 New York Times article, “A Case for Banning Facial Recognition.”

Under the Ordinance, a “private entity” shall not use “face recognition technologies” in “places of public accommodation” within the boundaries of the City of Portland. “Face recognition technologies” under the Ordinance means

an automated or semi-automated process that assists in identifying, verifying, detecting, or characterizing facial features of an individual or capturing information about an individual based on an individual’s face

Places of public accommodation include any place or service offering to the public accommodations, advantages, facilities, or privileges whether in the nature of goods, services, lodgings, amusements, transportation or otherwise. This covers just about any private business and organization. Note, Portland also passed a separate ordinance prohibiting the use of facial recognition technology by the city government.

There are some exceptions, however. Places of public accommodation do not include “an institution, bona fide club, private residence, or place of accommodation that is in its nature distinctly private.” It is not clear from the Ordinance what it means to be “distinctly private.” Also, the Ordinance does not apply:

  • When facial recognition technologies are necessary to comply with federal, state, or local law,
  • For user verification purposes by an individual to access the individual’s own personal or employer-issued communication and electronic devices, or
  • To automatic face detection services in social media applications.

So, in Portland, employees can still let their faces get them into their phones, including their company-provided devices. But businesses in Portland should evaluate whether they are using facial recognition technologies, whether they fall into one of the exceptions in the Ordinance, and, if not, what alternatives they have for the verification, security, and other purposes for which the technology was implemented.

California Assembly Passes CCPA Amendment: Employee Personal Information Exemption Extension

The California Consumer Privacy Act (“CCPA”) has only been in effect since January, but amendments are already on the horizon. Personal information in the employment context was highly contested during the CCPA’s amendment process prior to enactment and has continued to be a point of deliberation even after the CCPA’s effective date.

In its current version, the CCPA excludes certain employment-related personal information from most of the act’s requirements until January 1, 2021, leaving employees and applicants, in effect, as second-class “consumers.” The exclusion applies to personal information

“collected by a business about a natural person in the course of such person acting as a job applicant to or an employee, owner, director, officer, medical staff member, or contractor of that business, and to the extent the person’s personal information is collected and used by the business solely within the context of the natural person’s role or former role as a job applicant to or an employee, owner, director, officer, medical staff member, or contractor of that business.”

Thus, unlike consumers generally, employees and applicants currently may not request under the CCPA: the deletion of their personal information; the categories of personal information collected; the sources from which personal information is collected; the purpose for collecting or selling personal information; and the categories of third parties with whom the business shares their personal information.

It is important to highlight that under the current exemption, while employees are temporarily excluded from most of the CCPA’s protections, two areas of compliance remain: (i) providing a notice at collection, and (ii) maintaining reasonable safeguards for a subset of personal information, an obligation driven by the private right of action now available to individuals affected by a data breach caused by a business’s failure to do so.

The California Assembly has now passed AB1281, which would extend the exemption until January 1, 2022. The bill moves on to Governor Gavin Newsom’s desk for signing. Notably, the operation of the extension is contingent upon voters not approving ballot Proposition 24 in November, the California Privacy Rights Act (“CPRA”), which would amend the CCPA to include more expansive and stringent compliance obligations and, inter alia, would extend the employment personal information exemption until January 1, 2023.

In light of AB1281 and/or the CPRA, it appears likely that the CCPA’s exemption for employee personal information will, at a minimum, be extended until 2022.  We will continue to update the status of this amendment, and other related CCPA developments as they unfold.

More EEOC COVID-19 Guidance: Testing, Screening, Managers, Confidentiality, and Telework

Since March of this year, the Equal Employment Opportunity Commission (EEOC) has released guidance on a near-monthly basis addressing various FAQs concerning COVID-19 issues. The guidance has focused on disability-related inquiries, confidentiality, hiring, and reasonable accommodations under the Americans with Disabilities Act (ADA), as well as issues under Title VII of the Civil Rights Act and the Age Discrimination in Employment Act (ADEA). In its latest FAQ update posted yesterday, the EEOC covers some more practical questions employers have on several COVID-19 issues, such as testing, telecommuting, and sharing employee medical information.

COVID-19 Testing

As COVID-19 testing capabilities and resources have expanded, many employers across the country have been working on establishing testing protocols. Some still have concerns, however, about whether they are permitted to test, particularly considering the general ADA requirement that any mandatory medical test of employees be “job related and consistent with business necessity.”

The EEOC has already confirmed that employers may opt to administer COVID-19 testing to employees before initially permitting them to enter the workplace. In the updated FAQs, the EEOC further clarified that periodic testing of employees who are present in the workplace is permissible to determine whether an employee poses a direct threat to others. The EEOC also sought to address updates to CDC guidance. Specifically, the EEOC made clear that employers administering COVID-19 viral testing consistent with current CDC guidance will meet the ADA’s “business necessity” standard, and that employers should follow recommendations by the CDC or other public health authorities regarding whether, when, and for whom testing or other screening is appropriate. The EEOC acknowledged that the CDC and FDA may revise their recommendations based on new information, and reminded employers to stay up to date.

More on What Employers Can Ask Employees, and If Employees Refuse to Answer

For several months, employers have been building COVID-19 screening programs – taking employee temperatures and asking questions about COVID-19 symptoms and travel, among other things – before permitting employees to enter the employer’s facilities. Some employers have continued to wonder whether they are permitted under the ADA to ask employees whether they have had a COVID-19 test. The EEOC confirmed in the updated FAQs that employers may ask if employees have been tested for COVID-19. Presumably, this also means that employers may ask if the employee’s test was positive or negative, but this is not clear in the updated EEOC FAQs.

Because the permissibility of certain COVID-related inquiries is based on the existence of a direct threat, asking employees about COVID-19 testing does not extend to employees who are teleworking and not physically interacting with coworkers or others (for example, customers). Asking about COVID-19 testing also does not extend to whether an employee’s family members have COVID-19 or symptoms associated with COVID-19, because the Genetic Information Nondiscrimination Act (GINA) generally prohibits employers from asking employees medical questions about family members. The EEOC clarified, however, that employers may ask employees whether they have had contact with anyone diagnosed with COVID-19 or who may have symptoms associated with the disease.

The EEOC also further addressed whether employers may focus screening efforts on a single employee – e.g., asking only one employee COVID-19 screening questions. In this case, the employer must have a reasonable belief based on objective evidence that this person might have the disease, such as a display of COVID-19 symptoms. However, employees working regularly or occasionally onsite and who report feeling ill or who call in sick may be asked questions about their symptoms as part of workplace screening for COVID-19, according to the EEOC.

During the summer, several states began to implement mandatory and recommended quarantines for persons arriving from other states with high levels of community spread. The EEOC confirmed that employers do not have to wait until employees experience COVID-19 symptoms before asking where they have traveled, as such questions are not disability-related inquiries.

As several employers have learned, not all employees cooperate with employer-administered screening programs. When they object, employers should consider their options carefully, including whether an accommodation may be necessary. The EEOC acknowledges that the ADA allows employers to bar employees from physical presence in the workplace if they refuse to have their temperature taken or refuse to confirm whether they have COVID-19, have symptoms associated with COVID-19, or have been tested for COVID-19. Some employers want to make compliance with screening programs a condition of employment, subjecting employees to termination if they fail to comply. The EEOC did not discuss that option; however, the agency reminded employers that they may gain cooperation by asking employees the reasons for their refusal. Employers also can offer information and reassurance that they are taking steps to ensure workplace safety, that those steps are consistent with health screening recommendations from the CDC, and that the employer is careful about maintaining confidentiality.

Managers Sharing Information About Employees with COVID-19

It is not uncommon for managers to learn about the medical condition of employees they supervise. Because the ADA requires all employee medical information to be maintained confidentially, managers who discover an employee has COVID-19 may be unsure about what they may or should do with that information. The EEOC FAQs make clear that managers may report this information to appropriate persons in the organization in order to comply with public health authority guidance, such as contact tracing. Employers should consider directing managers on where to report this information, in order to minimize who receives it, and what to report. The EEOC also clarified that it would not violate the ADA for a worker to report to her manager the COVID-19 status of a coworker in the same workplace.

Recognizing that coworkers in small workplaces might be able to identify which worker(s) triggered contact tracing efforts, the EEOC reminds employers that they still may not confirm or reveal the employee’s identity. Employees who have a need to know this information about other employees should be specifically instructed to maintain its confidentiality.

Telework

Many employees continue to telework, particularly in occupations where it is feasible to do so. Being away from the office, however, does not eliminate these COVID-19 issues. For example, managers still have to maintain the confidentiality of employee medical information when they are working from home. This includes, where necessary, taking steps to limit access to the information until the manager can return to the office to store the information according to normal protocols. It also includes not disclosing the reason an employee may be teleworking or on leave if the reason is COVID-19.

 

While many questions remain, these updated FAQs provide some helpful guidance for employers. Of course, certain situations can present additional issues for employers to consider. And, state and local law also may modify the employer’s analysis for those jurisdictions. Employers need to keep up to date and should consult experienced counsel when navigating these issues.

HIPAA Covered Entities and Business Associates Need an IT Asset Inventory List, OCR Recommends

Last week, in its Cybersecurity Summer Newsletter, the Office for Civil Rights (OCR) published best practices for creating an IT asset inventory list to assist healthcare providers and business associates in understanding where electronic protected health information (ePHI) is located within their organizations and in improving HIPAA Security Rule compliance.  OCR investigations often find that organizations “lack sufficient understanding” of where all of their ePHI is located, and while the creation of an IT asset inventory list is not required under the HIPAA Security Rule, it could be helpful in the development of a risk analysis and, in turn, in implementing appropriate safeguards – which are HIPAA Security Rule requirements. Essentially, if an organization doesn’t know what IT assets it has or where its ePHI is, how can it effectively assess the risks associated with those assets and information and protect them?

The lack of an inventory, or an inventory lacking sufficient information, can lead to gaps in an organization’s recognition and mitigation of risks to the organization’s ePHI.  Having a complete understanding of one’s environment is key to minimizing these gaps and may help ensure that a risk analysis is accurate and thorough, as required by the Security Rule.

In general, an organization’s IT asset inventory list consists of “IT assets with corresponding descriptive information, such as data regarding identification of the asset (e.g., vendor, asset type, asset name/number), version of the asset (e.g., application or OS version), and asset assignment (e.g., person accountable for the asset, location of the asset).”

The OCR Newsletter suggests including the following types of assets in an organization’s IT asset inventory list (a minimal illustrative sketch follows the list below):

  • Hardware assets that comprise physical elements, including electronic devices and media, which make up an organization’s networks and systems. This can include mobile devices, servers, peripherals, workstations, removable media, firewalls, and routers.
  • Software assets that are programs and applications which run on an organization’s electronic devices. Well-known software assets include anti-malware tools, operating systems, databases, email, administrative and financial records systems, and electronic medical/health record systems. Though lesser known, there are other programs important to IT operations and security such as backup solutions, virtual machine managers/hypervisors, and other administrative tools that should be included in an organization’s inventory.
  • Data assets that include ePHI that an organization creates, receives, maintains, or transmits on its network, electronic devices, and media. How ePHI is used and flows through an organization is important to consider as an organization conducts its risk analysis.
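
OCR does not prescribe a particular format for the inventory, but the descriptive fields it mentions (asset type, vendor, version, and assignment) map naturally onto a simple structured record. The following is a minimal, hypothetical sketch of what an inventory entry and a basic ePHI filter might look like; the class, field, and example names are illustrative and are not drawn from the OCR Newsletter.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class ITAsset:
    """One entry in an IT asset inventory, loosely following the fields OCR describes."""
    asset_id: str              # asset name/number
    asset_type: str            # "hardware", "software", or "data"
    vendor: str
    version: str = ""          # application or OS version, where applicable
    owner: str = ""            # person accountable for the asset
    location: str = ""         # physical or network location
    stores_ephi: bool = False  # whether the asset creates, receives, maintains, or transmits ePHI


def ephi_assets(inventory: List[ITAsset]) -> List[ITAsset]:
    """Assets to prioritize in the risk analysis because they touch ePHI."""
    return [asset for asset in inventory if asset.stores_ephi]


# Hypothetical entries spanning the categories described above.
inventory = [
    ITAsset("WS-0142", "hardware", "Dell", "Windows 10", owner="Front desk", location="Clinic A", stores_ephi=True),
    ITAsset("EHR-PROD", "software", "ExampleEHR Inc.", "v12.3", owner="IT director", location="Data center", stores_ephi=True),
    ITAsset("PRN-007", "hardware", "HP", owner="Office manager", location="Clinic A"),
]

print([asset.asset_id for asset in ephi_assets(inventory)])  # ['WS-0142', 'EHR-PROD']
```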

In addition, the OCR Newsletter recommends the inclusion of IT assets that don’t necessarily store or process ePHI, but that still may lead to a security incident, such as Internet of Things (IoT) or other smart devices.  For example, a recent study by Quocirca, a security research firm, found that approximately 60% of businesses in the U.S., U.K., France, and Germany suffered an IoT printer-related data breach in 2019, with the average breach costing an organization approximately $400,000.

The OCR Newsletter also describes other cybersecurity-related and HIPAA compliance benefits an IT asset inventory list can provide beyond the risk analysis. For example, HIPAA requires that covered entities and business associates “[i]mplement policies and procedures that govern the receipt and removal of hardware and electronic media that contain [ePHI] into and out of a facility, and the movement of these items within the facility,” which will be more efficient if the organization has an IT asset inventory list with location/owner/assignment information in place.  Moreover, an IT asset inventory list can aid an organization in identifying and tracking devices to ensure timely updates, patches, and password changes.

HIPAA compliance is no doubt a significant challenge for large and small covered healthcare providers, other covered entities, and business associates, and data breaches are almost inevitable. Preparation of a comprehensive IT asset inventory, while not required, can go a long way toward both ensuring HIPAA compliance and preventing a security incident.  Below are some additional basic compliance measures:

  • Provide training and improve security awareness for workforce members when they begin working for the organization and periodically thereafter.
  • Maintain written policies and procedures that address required administrative, physical, and technical safeguards required under the Security Rule.
  • Maintain business associate agreements with all business associates.
  • Document compliance efforts.
  • Maintain and practice an incident response plan in the event of a data breach.

NSA Releases Helpful Guidance for Limiting Location Data Exposure

The National Security Agency (NSA) recently released guidance for its staffers on how to effectively limit location data exposure, which can also be useful for the general public. Businesses likely will have different perspectives on location data than the NSA, which is trying to protect its staffers and its vital national security missions. Businesses may want location data about their consumers and workforce members for several reasons, but some may not realize they are even collecting this data. As laws such as the California Consumer Privacy Act (CCPA) become more widespread in the U.S., businesses will need to be more deliberate about, and aware of, the data they are collecting.

The NSA guidance provides an outline of categories of mobile device geolocation services and recommendations on how to prevent exposure of sensitive location information and limit the amount of location data shared. The NSA also recommends pairing its guidance with an earlier Cybersecurity & Infrastructure Security Agency (CISA) security tip on privacy and mobile device apps.

As many businesses think about the categories of personal information they collect from consumers, members of their workforce, and others, geolocation may be the last thing that comes to mind. However, businesses are increasingly deploying apps, mobile phones, and devices to further their business needs. Consider the response to the COVID-19 pandemic: many businesses have obtained or developed various devices and apps enabling them to more efficiently screen employees and consumers for coronavirus symptoms and to maintain social distancing. The data collected through those technologies may not be apparent; businesses may be more focused on quickly meeting CDC and state guidance. More traditionally, businesses might provide workforce members company-owned iPhones, Fitbits for a wellness program, tablets to interact with consumers, or some other smart device or app, without realizing all of their capabilities or configurations.

“A cell phone begins exposing location data the second it is powered on because it inherently trusts cellular networks and providers. Devices’ location data, from GPS to Wi-Fi or Bluetooth connections, may be acquired by others with or without the user or provider’s consent,” states the NSA. “Anything that sends and receives wireless signals has location risks similar to phones, including Internet of Things (IoT) devices, vehicles and many products with ‘smart’ included in the name.”

In virtually all cases, the NSA will have different considerations for collecting and managing location data than a typical business. For businesses, such information can be helpful to serve legitimate business needs. However, under the CCPA, for example, businesses need to provide “consumers” (which currently includes employees and applicants residing in California) with a notice at collection. This notice must explain the categories of personal information that the business collects, and one of those categories is geolocation data. The notice also must explain the purposes for which such data will be used by the business. As businesses work through the process of rolling out new technologies, therefore, they will need to consider the scope of data collection, even if they are not interested in all of the data capable of being collected. If a business determines location data is not needed, the NSA guidance can be helpful, as it provides mitigation tips to help limit the collection of such data:

  • Disable location service settings on the device.
  • Disable radios when they are not actively in use: disable Bluetooth and turn off Wi-Fi if these capabilities are not needed.
  • Use Airplane Mode when the device is not in use.
  • Apps should be given as few permissions as possible (e.g. set privacy settings to ensure apps are not using or sharing location data).
  • Turn off settings (typically known as FindMy or Find My Device settings) that allow a lost, stolen, or misplaced device to be tracked.
  • Set browser privacy/permission location settings to not allow location data usage.
  • Use an anonymizing Virtual Private Network (VPN) to help obscure location.
  • Minimize the amount of data with location information that is stored in the cloud, if possible.

Of course, there are situations where a business will want location data collected, such as on company-provided devices with Find My Device capabilities that allow lost, stolen, or misplaced devices to be located.

As many who have gone through compliance with the General Data Protection Regulation in the European Union know, the CCPA and other laws that may come after it in the U.S. will require businesses to think more carefully about the personal information they collect, including location data. The NSA guidance is helpful in thinking about the steps a business can take to apply best practices to its collection of location data.

NYDFS Files First Enforcement Action Under Reg 500

On July 21, 2020, the New York Department of Financial Services (“DFS”) filed its first enforcement action under New York’s Cybersecurity Requirements for Financial Services Companies, 23 N.Y.C.R.R. Part 500 (“Reg 500”).    Reg 500, which took effect in March 2017, imposes wide-ranging and rigorous requirements on subject organizations and their service providers, which are summarized here.

According to the Statement of Charges, First American Title Insurance Co. (“First American”) failed to remediate a vulnerability on its public-facing website, thereby exposing millions of documents containing sensitive consumer information – including bank account numbers, mortgage and tax records, social security numbers, wire transaction receipts, and drivers’ license images – to unauthorized access.  More specifically, DFS claims that First American failed to:

  • Conduct a security review and risk assessment of the vulnerability – steps that were mandated by the Company’s own cybersecurity policies;
  • Properly classify the level of risk associated with the website vulnerability;
  • Adequately investigate that vulnerability (the Company reviewed only a tiny fraction of the impacted documents and, as a result, severely underestimated the seriousness of the vulnerability); and
  • Heed the advice of the Company’s internal cybersecurity team, which advised that further investigatory actions were needed.

The foregoing failures, DFS contends, violated six provisions of Reg 500.  Specifically:

  1. 23 NYCRR 500.02: The requirement to maintain a cybersecurity program that is designed to protect the confidentiality, integrity and availability of the covered entity’s information systems, and which is based on the covered entity’s risk assessment.
  2. 23 NYCRR 500.03: The requirement to maintain a written policy or policies, approved by senior management, setting forth the covered entity’s policies and procedures for the protection of its information systems and the nonpublic personal information (“NPI”) stored on those systems.
  3. 23 NYCRR 500.07: The requirement to limit user access privileges to information systems that provide access to NPI and periodically review such access privileges.
  4. 23 NYCRR 500.09: The requirement to conduct a periodic risk assessment of the covered entity’s information systems to inform the design of its cybersecurity program.
  5. 23 NYCRR 500.14(b): The requirement to provide regular cybersecurity awareness training for all personnel as part of the covered entity’s cybersecurity program, and to update such training to reflect risks identified by the covered entity in its risk assessment.
  6. 23 NYCRR 500.15: The requirement to implement controls, including encryption, to protect NPI held or transmitted by the covered entity, both in transit over external networks and at rest.

The case against First American is scheduled to proceed to an administrative hearing on October 26, 2020.  DFS is seeking civil penalties, along with an order requiring the Company to remedy its violations of Reg 500.  Each violation of Reg 500 carries a potential penalty of up to $1,000 and DFS is taking the position that each instance where NPI was subject to unauthorized access constituted a separate violation.  DFS alleges that hundreds of millions of documents were exposed to potential unauthorized access as a result of First American’s alleged violations and that, according to the Company’s own analysis, more than 350,000 documents were accessed without authorization as a result of the Company’s website vulnerability.  If DFS’s position on what constitutes a single violation prevails, First American could be exposed to hundreds of millions of dollars in civil penalties.
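
A rough, back-of-the-envelope calculation (ours, not DFS’s) shows how this theory of what counts as a violation drives the potential exposure, using the figures alleged in the Statement of Charges:

```python
# Illustrative arithmetic only, based on the figures alleged in the Statement of Charges.
PENALTY_PER_VIOLATION = 1_000   # maximum penalty per violation of Reg 500
DOCUMENTS_ACCESSED = 350_000    # documents the Company's own analysis reportedly found were accessed

exposure = PENALTY_PER_VIOLATION * DOCUMENTS_ACCESSED
print(f"${exposure:,}")         # $350,000,000
```

Counting the hundreds of millions of merely exposed documents as separate violations would push the theoretical figure far higher still.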

The case against First American may signal that DFS, after giving covered organizations several years to get their compliance programs in order, now intends to aggressively enforce Reg 500’s requirements.  To prepare for this eventuality, subject organizations should closely scrutinize their compliance programs – including their policies and procedures for conducting security reviews and risk assessments, and for investigating and responding to security incidents – and take proactive steps to plug any gaps in those programs.  We have prepared several articles, blog posts, and webinars to help organizations determine what Reg 500 requires and to assess their compliance with those requirements.

National Biometric Information Privacy Act, Proposed by Sens. Jeff Merkley and Bernie Sanders

Whether it is facial recognition technology being used in connection with COVID-19 screening tools and in law enforcement, continued use of fingerprint-based time management systems, or the use of various biometric identifiers for physical security and access management, applications involving biometric identifiers and information in the public and private sectors continue to grow. Concerns about the privacy and security of that information continue to grow as well. Several states have laws protecting biometric information in one form or another, chief among them Illinois, but the desire for federal legislation remains.

Modeled after Illinois’s Biometric Information Privacy Act (BIPA), the National Biometric Information Privacy Act (Act), proposed by Sens. Jeff Merkley and Bernie Sanders, contains three key provisions:

  • A requirement to obtain consent from individuals prior to collecting and disclosing their biometric identifiers and information.
  • A private right of action against entities covered by the Act that violate its protections which entitles aggrieved individuals to recover, among other things, the greater of (i) $1,000 in liquidated damages or (ii) actual damages, for negligent violations of the protections granted under the law.
  • An obligation to safeguard biometric identifiers and biometric information in a manner similar to how the organization safeguards other confidential and sensitive information, such as Social Security numbers.

The Act would apply to “private entities,” generally including a business of any size in possession of biometric identifiers or biometric information of any individual. Federal, state, and local government agencies and academic institutions are excluded from the Act.

Under the Act, private entities would be required to:

  • Develop and make available to the public a written policy establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information. That schedule may not extend more than one year beyond an individual’s last interaction with the entity, but destruction could be required earlier;
  • Collect biometric identifiers or biometric information only when needed to provide a service to the individuals or have another valid business reason;
  • Inform individuals that their biometric identifiers or biometric information is being collected or stored, along with the purpose and length of the collection, storage, or use, and obtain a written release from the individual, which may not be combined with other consents, including an employment agreement;
  • Obtain a written release immediately prior to the disclosure of any biometric identifier or biometric information that includes the data to be disclosed, the reason for the disclosure, and the recipients of the data; and
  • Maintain the information using a reasonable standard of care.

Readers familiar with the BIPA in Illinois will find these requirements familiar. Readers familiar with the California Consumer Privacy Act (CCPA) will find the following “Right to Know” familiar as well. The Act would grant individuals the right to request certain information about biometric identifiers or biometric information collected by a covered entity within the preceding 12-month period. This information includes “specific pieces of personal information” and “the categories of third parties with whom the business shares the personal information.” The Act uses “personal information” but does not define it, leaving it unclear if it pertains only to biometric identifiers and biometric information.

Most troubling is the private right of action provision referenced above. The Act uses language similar to that in the BIPA, which has led to a flood of class action litigation, including a decision by the Illinois Supreme Court finding that plaintiffs need not show actual harm to recover under the law. The legislative process likely will result in some modification to the bill, assuming it survives at all; federal privacy bills often do not. Nonetheless, we will continue to monitor the progress of this and similar legislation.
