Combating Improper Robocalls: The TRACED Act Signed into Law

In the final days of 2019, the Telephone Robocall Abuse Criminal Enforcement and Deterrence Act (“TRACED Act”) was signed into law to combat the increasing number of illegal robocall practices and other intentional violations of telemarketing laws. The TRACED Act, a bipartisan bill first introduced in Congress in 2018, broadens FCC authority to levy Telephone Consumer Protection Act (“TCPA”) civil penalties and extends the time period for the FCC to catch and take civil enforcement action against intentional violations. The new law will not put an immediate end to improper robocalling practices, which have been exacerbated in recent years by the growing industry of “spoofing” technology, but it will certainly cause individuals to think twice before engaging in illegal robocalling activity.

It is important to note that not all robocalling practices are illegal – generally, robocalls are permissible if the company has received written consent from a consumer to call in that manner. There are also a few types of robocalls that are permissible without written consent: purely informational calls (e.g., flight cancellations, appointment reminders, school delays), messages from certain healthcare providers, political calls, messages from charities, and debt collection calls (excluding services that offer to reduce your debt).

Below are several key provisions of the TRACED Act likely to be impactful in curbing improper robocall activity:

  • Requires the FCC to promulgate rules to help protect consumers from receiving unwanted calls or text messages from a caller with an unauthenticated number. Note: The FCC’s rulemaking process for this provision is already underway.
  • Requires the FCC to promulgate rules establishing when a provider may block a call based on information provided by a call authentication framework, and establishing a process to permit a calling party adversely affected by the authentication framework to verify the authenticity of their calls.
  • Requires the FCC and Department of Justice to assemble an interagency working group to study and report to Congress on the enforcement of the prohibition on certain robocalls – specifically, examining how enforcement can be improved and what types of policies, laws, and constraints may be inhibiting it.
  • Requires the FCC to initiate a proceeding to determine whether its policies regarding access to number resources could be modified to help reduce access to numbers by potential robocall violators.
  • Requires voice service providers to develop call authentication technologies. Providers may not charge for these services, and are given a safe harbor from liability for making reasonable efforts to effectively implement such technology.
  • Implements a forfeiture penalty for violations (with or without intent) of the prohibition on certain robocalls.
  • Increases the TCPA fines for robocall violations and extends the FCC’s statute of limitations on such violations. A violator can be fined up to $10,000 per call.

Much praise has been directed toward the recently enacted TRACED Act, including from Senator Chuck Schumer, who highlighted on Twitter that “Americans were battered by 48 billion robocalls last year (2019)…I’m so proud I fought for the #TRACEDact,” and from FCC Chairman Ajit Pai, who said in a statement on behalf of the FCC, “I applaud Congress for working in a bipartisan manner to combat illegal robocalls and malicious caller ID spoofing.” Nonetheless, only time will tell how effective the new law will be in deterring illegal robocalls, a practice that only seems to be getting worse.

The recently enacted TRACED Act arrives alongside other recent attention on the TCPA. In June 2019, the U.S. Supreme Court issued its long-awaited decision in PDR Network, LLC v. Carlton & Harris Chiropractic, Inc., addressing whether the Hobbs Act requires a district court to accept the FCC’s 2006 Order interpreting the TCPA. Unfortunately, the Court dodged the issue, instead ruling unanimously that the lower court had failed to consider two preliminary issues. A final decision in this case had been long awaited, and the wait continues. There is also a growing circuit split over the definition of an Automatic Telephone Dialing System (ATDS) under the TCPA, and the FCC recently sought comments from the public on the scope of the TCPA, including the ATDS definition. Needless to say, 2020 should be an interesting year for the TCPA.

Websites: A Growing Compliance Concern – CCPA, HIPAA, Accessibility, State Laws…

Websites play a vital role for organizations. They facilitate communication with consumers, constituents, patients, employees, and the general public. They project an organization’s image and promote goodwill, provide information about products and services and allow for their purchase. Websites also inform investors about performance, enable job seekers to view and apply for open positions, and accept questions and comments from visitors to the site or app, among many other activities and functionalities. Because of this vital role, websites have become an increasing subject of regulation making them a growing compliance concern.

Currently, many businesses are working to become compliant with the California Consumer Privacy Act (“CCPA”) which, if applicable, requires the conspicuous posting of a privacy policy on a business’s website. But the CCPA is not the first, nor will it be the last, compliance challenge for organizations that operate websites and other online services. The CCPA, along with the flood of ADA accessibility litigation, is causing many organizations to revisit their websites and online services to meet the growing compliance burden.

What are some of these requirements?

ADA Accessibility. When people think about accommodating persons with disabilities, they often are drawn to situations where a person’s physical movement in a public place is impeded by a disability – stairs to get into a library or narrow doorways to use a bathroom. Indeed, Title III of the Americans with Disabilities Act grants disabled persons the right to full and equal enjoyment of the goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation. Although websites were not around when the ADA was enacted, they are now, and courts are applying ADA protections to those sites. The question is whether a website or application is accessible.

Although not yet adopted by the Department of Justice, which enforces Title III of the ADA, the Web Content Accessibility Guidelines established by the W3C’s Web Accessibility Initiative appear to be the most likely place courts will look to assess the accessibility of a website to which Title III applies. State and local governments have similar obligations under Title II of the ADA, and those entities might find guidance here.

HIPAA. Anyone who has made a first visit to a doctor’s office likely was greeted with a HIPAA “notice of privacy practices” and asked to sign an acknowledgement of receipt. Most covered health care providers have implemented this requirement, but they may not be aware of the website requirement. HIPAA regulation 45 CFR 164.520(c)(3)(i) requires that a covered entity maintaining a website with information about its customer services or benefits prominently post its notice of privacy practices on the site and make the notice available electronically through the site.

COPPA. The Children’s Online Privacy Protection Act (COPPA) was enacted to give parents more control concerning the information websites collect about their children under 13. Regulated by the Federal Trade Commission (FTC), COPPA requires websites and online services covered by COPPA to post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children. COPPA applies to websites and online services directed to children under the age of 13 that collect personal information, and to sites and online services geared toward general audiences when they have “actual knowledge” they are collecting information from children under 13. Find out more about compliance here.

FTCA. Speaking of the FTC, that agency also enforces the federal consumer protection laws, including Section 5 of the Federal Trade Commission Act (FTCA), which prohibits unfair and deceptive trade practices affecting commerce. When companies tell consumers they will safeguard their personal information, including on their websites, the FTC requires that they live up to these promises. Businesses should review their website disclosures to ensure they are not describing privacy and security protections that are not actually in place.

CCPA. As mentioned above, a CCPA-covered business that maintains a website must post a privacy policy on its website homepage through a conspicuous link using the word “privacy,” or, for a mobile application, on the app’s download or landing page. That is not all. The website must also provide certain mechanisms for consumers to contact the business about their CCPA rights, such as the right to require deletion of their personal information and the right to opt out of the sale of personal information. The latter must be provided through an interactive webform accessible via a clear and conspicuous link titled “Do Not Sell My Personal Information” or “Do Not Sell My Info.”
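For businesses wondering what this looks like in practice, below is a minimal, illustrative sketch of a homepage link and opt-out webform, assuming a Python/Flask application; the route paths, form field, and helper function are hypothetical, and neither the CCPA nor its regulations prescribe any particular implementation.

```python
# Minimal illustrative sketch (not legal advice): a conspicuous homepage link
# titled "Do Not Sell My Personal Information" pointing to an interactive
# opt-out webform. Flask is assumed; routes and field names are hypothetical.
from flask import Flask, request, render_template_string

app = Flask(__name__)

HOMEPAGE = """
<a href="/privacy">Privacy</a>
<a href="/do-not-sell">Do Not Sell My Personal Information</a>
"""

OPT_OUT_FORM = """
<form method="post" action="/do-not-sell">
  <label>Email associated with your account:
    <input type="email" name="email" required>
  </label>
  <button type="submit">Submit opt-out request</button>
</form>
"""

def record_opt_out(email):
    # Placeholder: persist the opt-out so downstream "sale" processes honor it.
    print(f"Opt-out recorded for {email}")

@app.route("/")
def homepage():
    # The opt-out link should be clear and conspicuous on the homepage.
    return render_template_string(HOMEPAGE)

@app.route("/do-not-sell", methods=["GET", "POST"])
def do_not_sell():
    if request.method == "POST":
        record_opt_out(request.form["email"])
        return "Your opt-out request has been received."
    return render_template_string(OPT_OUT_FORM)
```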

GDPR. The European Union’s General Data Protection Regulation (GDPR) became effective in 2018 and reaches companies and organizations globally. In general, organizations subject to the GDPR that collect personal data on their websites must post a privacy policy on their websites setting forth the organization’s privacy practices.

CalOPPA. The California Online Privacy Protection Act (CalOPPA) requires commercial operators of online services, including websites and mobile and social apps, which collect personally identifiable information from Californians to conspicuously post a privacy policy. Privacy policies should address how companies collect, use, and share personal information. Companies can face fines of up to $2,500 each time a non-compliant app is downloaded.

Delaware and Nevada. In 2016, Delaware became the second state to adopt an online privacy protection act, requiring disclosures similar to those under CalOPPA. Nevada has enacted website privacy legislation of its own. First, like DelOPPA and CalOPPA, NRS 603A.340 requires “operators” to make a privacy notice reasonably accessible to consumers through their Internet websites or online services. That notice must, among other things, identify the categories of covered information the operator collects through the site or online service about consumers who use or visit it, and the categories of third parties with whom the operator may share such covered information. In general, an operator is a person who: (i) owns or operates an Internet website or online service for commercial purposes; (ii) collects and maintains covered information from consumers who reside in Nevada and use or visit the website or online service; and (iii) engages in activity that constitutes a sufficient nexus with Nevada, such as purposefully directing its activities toward the state. Second, effective October 1, 2019, Nevada added to its website regulation by requiring operators to designate a request address on their websites through which a consumer may submit a verified request to opt out of the sale of their personal information.


This is by no means an exhaustive list of the regulatory requirements (we’ve focused generally on privacy and security) that may apply to your website or online service. Organizations should regularly revisit their websites not just to add new functionality or fix broken links; they should also have a process for ensuring that their sites and services meet applicable regulatory requirements.

The Case that Sparked the CCPA Gets an FTC Final Order

Recently, the U.S. Federal Trade Commission issued an important opinion concluding that Cambridge Analytica, LLC, the data analytics and consulting company, engaged in “deceptive practices to harvest personal information” of tens of millions of social media users by using data gathered through a company-developed app, GSRapp, for voter profiling purposes without the users’ knowledge or consent. In addition, the FTC found that Cambridge Analytica engaged in deceptive practices connected to its participation in the EU-US Privacy Shield (“Privacy Shield”) framework.

In particular, the FTC opinion highlighted that Cambridge Analytica, its then-CEO, and the GSRapp developer deceived consumers by falsely telling app users that the app would not collect their names or other identifiable information, but then collecting User IDs, which gave Cambridge Analytica access to users’ social media profiles containing identifiable information.

Regarding Cambridge Analytica’s deceptive Privacy Shield practices, the FTC concluded that Cambridge Analytica continued to claim participation in the Privacy Shield framework after allowing its certification to lapse. Moreover, the company failed to adhere to the Privacy Shield requirement that a company ceasing participation in the framework affirm to the Department of Commerce that it will continue to apply Privacy Shield protections to personal information collected during the time period it participated in the framework.

The FTC’s Final Order prohibits Cambridge Analytica from making false representations regarding the extent to which it protects the privacy and confidentiality of personal information, and regarding its participation in the Privacy Shield framework and other similar regulatory or standard-setting organizations. Further, the company must continue to apply Privacy Shield protections to all personal information collected during the time period the company participated in the program, or alternatively delete or return the information. Finally, Cambridge Analytica must delete all personal information collected through the GSRapp.

The FTC’s opinion and order against Cambridge Analytica are particularly relevant because the newly effective California Consumer Privacy Act was a direct response to Cambridge Analytica’s deceptive handling of user personal information, as well as other similar incidents. The CCPA creates extensive obligations for companies that handle consumer personal information and provides consumers with enhanced control over their data, with the aim of preventing deceptive activity such as Cambridge Analytica’s. Key relevant CCPA provisions include:

Notice Obligations

  • A business that collects a consumer’s personal information must inform consumers, at or before the point of collection, as to the categories of personal information to be collected and the purposes for which the categories of personal information will be used. This notice need not identify specific pieces of personal information.
  • A business must disclose certain information in an online privacy policy or on an internet website, as applicable. This information includes, without limitation, an explanation of the rights consumers have under the CCPA and certain information about the categories of personal information it collected, disclosed, or sold, as applicable. These disclosures must be updated every 12 months.

Consumer Rights

  • A consumer’s right to request information regarding the categories of personal information collected about them, the sources of that information (such as an online survey or user profile, as in the case of Cambridge Analytica), the categories of personal information used for business purposes or sold to third parties, and the “specific pieces” of information collected.
  • A consumer’s right to request that a business delete personal information collected about them.

The CCPA is here (effective since January 1) and the development of a meaningful data protection program has never been more important. Jackson Lewis has established a CCPA Team that is available to answer questions regarding the CCPA and assist covered businesses in their compliance efforts.

CCPA Is Here, and it Does Have Requirements for Employees, Applicants, etc.

Some business leaders and HR professionals may be waking up this morning not realizing they must provide a “Notice at Collection” to some or all of their employees and applicants under the new California Consumer Privacy Act (CCPA). This is not surprising given the confusion during 2019 about whether this law would reach that far. The passage of AB 25 confirmed that while employees would be temporarily excluded from most of the CCPA’s protections, two areas of compliance remain: (i) providing a notice at collection, and (ii) maintaining reasonable safeguards for personal information, an obligation now backed by a private right of action for individuals affected by a data breach caused by a business’s failure to do so.

Before addressing these two employment-related aspects of the CCPA, it is helpful to remember which entities are subject to CCPA. The basic rule follows.

In general, the CCPA applies to a “business” that:

A. does business in the State of California,

B. collects personal information (or on behalf of which such information is collected),

C. alone or jointly with others determines the purposes or means of processing of that data, and

D. satisfies one or more of the following: (i) annual gross revenue in excess of $25 million, (ii) alone or in combination, annually buys, receives for the business’s commercial purposes, sells, or shares for commercial purposes, the personal information of 50,000 or more consumers, households, or devices, or (iii) derives 50 percent or more of its annual revenues from selling consumers’ personal information.

For more information on this part of the law, please review Does the CCPA Apply to Your Business?
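As a rough illustration of how the threshold test above fits together (and only that – the statutory terms are more nuanced than this), the criteria can be read as a baseline test plus any one of three thresholds. A minimal sketch in Python follows; the field and function names are ours, not the statute’s.

```python
# Illustrative sketch only: a simplified reading of the CCPA threshold test.
# Field and function names are hypothetical; consult the statute and
# regulations for the precise definitions of "business," "consumer," etc.
from dataclasses import dataclass

@dataclass
class BusinessProfile:
    does_business_in_california: bool
    collects_personal_information: bool
    determines_purposes_or_means: bool
    annual_gross_revenue: float              # in dollars
    ca_consumers_households_devices: int     # PI bought/received/sold/shared annually
    share_of_revenue_from_selling_pi: float  # 0.0 to 1.0

def ccpa_likely_applies(b: BusinessProfile) -> bool:
    baseline = (
        b.does_business_in_california
        and b.collects_personal_information
        and b.determines_purposes_or_means
    )
    threshold = (
        b.annual_gross_revenue > 25_000_000
        or b.ca_consumers_households_devices >= 50_000
        or b.share_of_revenue_from_selling_pi >= 0.50
    )
    return baseline and threshold

# Example: a retailer with $30M in revenue that collects PI from Californians.
print(ccpa_likely_applies(BusinessProfile(True, True, True, 30_000_000, 10_000, 0.0)))  # True
```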

Notice at Collection

A “notice at collection” requires two pieces of information to be communicated to the consumer/employee (an illustrative sketch of one way to organize this follows the list below):

  1. The categories of personal information collected by the business. There are eleven categories of personal information, such as identifiers, geolocation data, biometric information, employment-related information, etc. See Cal. Civ. Code Sec. 1798.140(o).
  2. For each category, the uses of personal information by the business.
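By way of illustration only, one way a business might organize this information internally is a simple mapping of each category collected to the uses for that category, which can then be rendered into the notice text. The category labels below loosely track Cal. Civ. Code Sec. 1798.140(o); the structure and example uses are hypothetical, not a template.

```python
# Illustrative sketch only: organizing a "notice at collection" as a mapping
# of personal information categories to the business purposes for each.
NOTICE_AT_COLLECTION = {
    "Identifiers (name, address, email, SSN)": [
        "Administering payroll and benefits",
        "Complying with tax and employment laws",
    ],
    "Professional or employment-related information": [
        "Evaluating job applications",
        "Managing performance and promotions",
    ],
    "Geolocation data": [
        "Securing company-issued devices",
    ],
}

def render_notice(mapping):
    """Render the category-to-use mapping as plain notice text."""
    lines = ["Notice at Collection", ""]
    for category, uses in mapping.items():
        lines.append(f"Category collected: {category}")
        for use in uses:
            lines.append(f"  - Used for: {use}")
        lines.append("")
    return "\n".join(lines)

print(render_notice(NOTICE_AT_COLLECTION))
```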

There are, of course, some questions employers may have about this notice, such as:

    • Who must get it? AB 25 refers to the following categories of “consumers” (natural persons who are California residents) – job applicants to, employees of, owners of, directors of, officers of, medical staff members of, or contractors of the business. Note, the CCPA does not define these terms, and recent proposed regulations do not address AB 25 at all. Guidance may come with final regulations.
    • When must they get it? The statute requires the notice to be provided at or before collection of personal information. In the case of applicants, that might mean providing the notice on the company’s website if, for example, it receives information from applicants on the site concerning open positions. In the case of employees, assuming different notices will be provided because more information is collected from employees, a notice at the beginning of the onboarding process, such as with offer letters, might make sense. Some employers may want to include the notice in employee handbooks, although this may not satisfy the “at or before collection” requirement. Handbooks typically are not provided until after some personal information has been collected from an employee, but they could give employees a place for easy reference to the business’s practices concerning personal information.
    • Is notice required for current employees? It is true that businesses have already collected personal information about individuals working for the company prior to 2020. However, collection is an ongoing process. One of the categories of personal information, for example, is website browsing activity. Many businesses now continually track this activity if only to safeguard their systems and implement electronic communications and information systems policies.
    • Should the notice include information on where employees can go with questions? This is not currently required. Providing employees, applicants, and others a place to go with questions, however, might be a good idea. Employees may not have received this kind of notice before and may have a number of questions. Designating individuals in the organization to address those questions, and directing employees and applicants to those individuals, would help to ensure consistent messaging about the business’s practices.

Reasonable Safeguards

The second issue for employers under the CCPA is safeguarding employee personal information. Under the CCPA, California consumers, including employees and applicants, affected by a data breach can bring an action for statutory damages when the breach is caused by the business’s failure to maintain reasonable safeguards to protect a subset of personal information, subject to a 30-day cure period. A consumer can recover damages in an amount not less than $100 and not greater than $750 per incident or actual damages, whichever is greater, as well as injunctive or declaratory relief and any other relief the court deems proper.

There is no regulatory guidance in California concerning what it means to have “reasonable safeguards.” However, former California Attorney General Kamala Harris issued a 2016 data breach report in which she interpreted an existing California statute, Cal. Civ. Code 1798.81.5(b), to mean that businesses must at least satisfy the 20 controls in the Center for Internet Security’s Critical Security Controls in order to be considered reasonable. It is not clear whether those controls will be sufficient to meet the CCPA’s standard, but they are a good place to look for guidance. Note also that the “reasonable safeguards” obligation applies to a subset of personal information, namely:

An individual’s first name or first initial and his or her last name in combination with any one or more of the following data elements, when either the name or the data elements are not encrypted or redacted:

  1. Social security number,
  2. Driver’s license number, California identification card number, or other government identifiers (e.g., tax identification number, passport number, military identification number),
  3. Account number, credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account,
  4. Medical information,
  5. Health insurance information, and
  6. Biometric identifiers.

Thus, businesses should be reviewing their data security policies and procedures not just with respect to consumer data, but also with respect to employment-related activities – payroll, benefits, recruiting, direct deposit, shared services, background checks, etc. This also means evaluating what their third-party service providers are doing to protect the personal information of employees, applicants, contractors, etc. Note that other states have similar mandates, including Colorado, Massachusetts, and New York (effective in March 2020).

Businesses that find themselves subject to the CCPA should act quickly to satisfy their AB 25 requirements. Of course, this may be temporary because AB 25 sunsets on January 1, 2021. However, considering the current direction of privacy law, it seems likely that there will be more, not fewer, privacy protections for employees by the end of 2020.

Personal Information, Private Information, Personally Identifiable Information…What’s the Difference?

When privacy geeks talk “privacy,” it is not uncommon for them to use certain terms interchangeably – personal data, personal information, personally identifiable information, private information, individually identifiable information, protected health information, or individually identifiable health information. They might even speak in acronyms – PI, PII, PHI, NPI, etc. Blurring those distinctions might be OK for casual conversation, but as organizations develop data privacy and security compliance programs, the meanings of these terms can have significant consequences. A good example exists within the California Consumer Privacy Act (“CCPA”) and its interaction with other laws.

The CCPA, effective January 1, 2020, contains an expansive definition of “personal information.” See Cal. Civ. Code Sec. 1798.140(o). The basic definition is information that identifies, relates to, describes, is reasonably capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. The definition goes on to enumerate, without limitation, certain categories of information (e.g., identifiers, website activity, biometric information, geolocation) if they identify, relate to, describe, are reasonably capable of being associated with, or could be reasonably linked, directly or indirectly, with a particular consumer or household. With respect to this broad set of data, the CCPA extends to California consumers substantial rights, including the right to request deletion of that data or to opt-out of its sale.

The CCPA’s private right of action for data breaches, however, applies to a much narrower subset of “personal information” defined above. Specifically, the CCPA incorporates another section of California law, Cal. Civ. Code Sec. 1798.81.5(d)(1)(A), to define personal information that, if breached, and which the owner failed to reasonably safeguard, could expose the owner to statutory damages of up to $750 per person. For this purpose, personal information means:

An individual’s first name or first initial and the individual’s last name in combination with any one or more of the following data elements…:

(i) Social security number.

(ii) Driver’s license number, California identification card number, tax identification number, passport number, military identification number, or other unique identification number issued on a government document commonly used to verify the identity of a specific individual.

(iii) Account number or credit or debit card number, in combination with any required security code, access code, or password that would permit access to an individual’s financial account.

(iv) Medical information.

(v) Health insurance information.

(vi) Unique biometric data generated from measurements or technical analysis of human body characteristics, such as a fingerprint, retina, or iris image, used to authenticate a specific individual.

Note also that the CCPA excludes certain information from its general definition of personal information, such as “protected health information” maintained by covered entities and business associates under the Health Insurance Portability and Accountability Act (“HIPAA”).

But the PI, PII, PHI…conundrum does not end with the CCPA. An organization with CCPA obligations also may maintain “private information” of New York residents. Under the New York Stop Hacks and Improve Electronic Data Security Act (“SHIELD Act”), that organization would have to adopt reasonable safeguards to protect “private information” which is defined to mean, in general, any information concerning a natural person which, because of an identifier, can be used to identify such natural person if it is in combination with any one or more of the following data elements:

  • social security number;
  • driver’s license number or non-driver identification card number;
  • account number, or credit or debit card number, which alone or together with a required code would permit access to an individual’s financial account;
  • biometric information, meaning data generated by electronic measurements of an individual’s unique physical characteristics, such as a fingerprint, voice print, retina or iris image, or other unique physical representation or digital representation of biometric data which are used to authenticate or ascertain the individual’s identity.

Private information also includes a user name or e-mail address in combination with a password or security question and answer that would permit access to an online account.
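To make these definitional differences concrete, below is a small, illustrative Python sketch (not a compliance tool) comparing a simplified reading of the CCPA’s breach-actionable subset of personal information with the SHIELD Act’s “private information.” The element lists are abbreviated, the field names are ours, and the encryption/redaction caveats in both statutes are omitted.

```python
# Illustrative sketch only: a simplified comparison of the CCPA's
# breach-actionable subset of "personal information" (Cal. Civ. Code
# 1798.81.5) and the NY SHIELD Act's "private information."
# Encryption/redaction caveats and other statutory nuances are omitted.

CCPA_BREACH_ELEMENTS = {
    "ssn", "drivers_license", "government_id", "financial_account_with_code",
    "medical_info", "health_insurance_info", "biometric_data",
}

SHIELD_ELEMENTS = {
    "ssn", "drivers_license", "financial_account_with_code", "biometric_data",
}

def ccpa_breach_pi(record):
    # First name or initial plus last name, combined with a listed element.
    has_name = bool(record.get("first_name")) and bool(record.get("last_name"))
    return has_name and any(e in record for e in CCPA_BREACH_ELEMENTS)

def shield_private_info(record):
    # Identifying information plus a listed element, OR a username/email
    # address combined with a password or security question and answer.
    has_name = bool(record.get("first_name")) and bool(record.get("last_name"))
    combo = has_name and any(e in record for e in SHIELD_ELEMENTS)
    credentials = ("email" in record or "username" in record) and "password" in record
    return combo or credentials

# The same leaked record can fall inside one definition and outside another:
# an email address with its password is "private information" under the
# SHIELD Act, but is not within the CCPA's narrower breach-actionable subset.
leaked = {"email": "jdoe@example.com", "password": "hunter2"}
print(ccpa_breach_pi(leaked))       # False
print(shield_private_info(leaked))  # True
```

The point of the contrast is the one made above: the same data elements can trigger obligations under one statute’s definition and not another’s, which is why the particular definition in play matters when scoping safeguards and incident response.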

Confused yet? Perhaps your organization is not subject to the CCPA or the NY SHIELD Act, but you own and operate a website that collects personal information from consumers who reside in California and Delaware. Laws in those states require a website privacy policy that describes certain practices concerning “personally identifiable information,” defined in Delaware to mean:

any personally identifiable information…collected online by the operator…from that user…including a first and last name, a physical address, an e-mail address, a telephone number, a Social Security number, or any other identifier that permits the physical or online contacting of the user, and any other information concerning the user collected by the operator…from the user and maintained in personally identifiable form in combination with any identifier described in this paragraph.

A similar definition exists under the California law. These distinctions just scratch the surface and add to the complexity of the emerging patchwork of data privacy and security law in the United States.

So, when thinking about personal information, it is important to remember that not only does the definition extend beyond just one’s name and social security number, but the term itself and its definition likely will differ depending on the particular statutes or regulations you are analyzing. When assessing an organization’s threats and vulnerabilities to personal information, or preparing policies and procedures to safeguard it, be sure to develop an appropriate definition that takes into account the necessary elements of data.

10 Steps for Tackling Data Privacy and Security Laws in 2020 for In-House Counsel and HR Pros

After years of data breaches, mass data collection, identity theft crimes, and failed attempts at broad-based federal legislation, 2020 may be the year that state privacy and data security legislation begins to take hold in the U.S. For example, the California Consumer Privacy Act (“CCPA”) and the New York Stop Hacks and Improve Electronic Data Security Act (“SHIELD Act”), both effective in 2020 and with application outside their respective states, are already spurring more active compliance efforts. This rapidly developing area of law presents a dizzying challenge for “compliance” personnel whose plates are already filled with an alphabet soup of regulation. The challenge tends to fall particularly hard on in-house counsel and human resources professionals and their IT counterparts whose teams (many times of only one or two) are frequently spread too thin.

The CCPA and SHIELD Act are by no means the only laws on the books. Other state legislatures, such as New Jersey’s, are advancing comprehensive data privacy and security laws. And, of course, many states have enacted similar laws – all 50 states have enacted data breach notification laws, and several states (e.g., Colorado, Florida, Illinois, Maryland, Massachusetts, Nevada, Oregon) require businesses to maintain reasonable safeguards to protect personal information, including written contracts with vendors that access personal information. On top of that, certain organizations must comply with industry-specific federal mandates, such as the Health Insurance Portability and Accountability Act (“HIPAA”) and the Gramm-Leach-Bliley Act (“GLBA”), while others are balancing international regulation, most notably the European Union’s General Data Protection Regulation (“GDPR”).

Meeting this challenge can seem overwhelming, but there are some strategies and best practices that can help in 2020 and beyond.

  1. Set expectations. Compliance is not a one-time endeavor. It is an on-going effort, a marathon, not a sprint. Building a strong compliance and risk management program is necessary, but it will take time, resources, and commitment. The support of organization leadership is critical, so get them on board, apprise them of the costs of building an achievable program, and the costs of doing nothing.
  2. Build your team. The data privacy and security challenge cannot be solved by the IT department alone. Technology safeguards are critical, but they do not replace strong administrative, physical, and organizational controls. In-house counsel and HR professionals should work on eliminating silos and push for an interdisciplinary team – sales, finance, R&D, marketing, operations, legal, HR, IT. Collectively, the team should have deep institutional knowledge; a strong understanding of the business, its need for and uses of data, and threats and vulnerabilities to data; an awareness of industry expectations; and the capacity to influence new practices and procedures for processing data.
  3. Maintain a Written Information Security Program. It is not enough to say, “We are doing that.” From a compliance perspective, data privacy and security policies and procedures need to be in writing. And, written policies and procedures also help to maintain consistency in the organization’s practices and better support discipline for violations of the rules.
  4. Vendors – trust but verify. Third-party vendors provide critical support to organizations often involving access to sensitive information. The idiom “a chain is no stronger than its weakest link” is quite appropriate considering many organizations have experienced data breaches because of their vendors’ security incidents. Organizations simply must have a better understanding of the strength of their vendors’ safeguards for protecting information. They should maintain strong vendor management programs that begin to apply at procurement and continue until the service agreement terminates and the organization’s data is secured.
  5. Communications About Your Program Should be Accurate and Accessible. Increasingly, the law requires organizations to post website statements summarizing their data privacy and security practices. Examples include HIPAA and laws in California, Delaware, and Nevada. These statements should be accurate and accessible. Inaccurate statements, such as those that overstate security safeguards, can lead to deceptive trade practice claims. As required by the CCPA and urged by the flood of litigation under Title III of the Americans with Disabilities Act, the statements also need to be accessible to persons with disabilities.
  6. Know the Law and Stay in Touch. An organization’s compliance team need not and should not be comprised of lawyers. But it should maintain a keen awareness of applicable legal mandates and a general sense of where the law is headed as it relates to the organization. Active participation in trade and similar associations can be particularly helpful, as can subscribing to dedicated legal resources, blogs, etc.
  7. Training and Awareness. Employees falling victim to phishing attacks is one of the most frequent causes of a data breach. Regular, role-based training on the organization’s policies and procedures along with general security awareness training can substantially reduce this and other data risks.
  8. Embrace technology…carefully. The latest devices and software applications can benefit the organization’s business enormously. However, they may not have been developed or designed with data privacy and security in mind, or at least as needed to address the organization’s compliance needs. Consider biometric technologies that tout stronger identity verification for applications such as POS system access and worker time management. If not rolled out or configured carefully, these devices can cause significant legal exposure relating to the collection, disclosure, and destruction of personal information.
  9. Less is more. Some organizations pride themselves on their comprehensive recordkeeping systems, for example, claiming to have retained all records since inception. Such practices may not be necessary, and in many cases are not prudent. Retaining massive amounts of data may be needed in certain contexts, but it should be carried out strategically and deliberately, with a plan to shed the data once its usefulness has ceased.
  10. Be reasonable. Perhaps this should be first on the list. But it is last to serve as a reminder that whatever steps are taken, they should be reasonable. Indeed, most regulatory data privacy and security frameworks require “reasonable” safeguards. Of course, this is not easy to define, but reasonableness should be a fundamental principle guiding your program.


With 2020 poised to bring more acuity to the direction of privacy and security law in the U.S., adopting some or all of the above strategies and best practices will help support a strong, adaptive, ongoing, and reasonable privacy and information security program.

CCPA Notice of Collection – Are You Collecting Geolocation Data, But Do Not Know It?

Businesses subject to the California Consumer Privacy Act (“CCPA”) are working diligently to comply with the law’s numerous mandates, although final regulatory guidance has yet to be issued. Many of these businesses are learning that AB25, passed in October, requires employees, applicants, and certain other California residents to be provided a notice of collection at least for the next 12 months. These businesses need to think about what must be included in these notices.

A Business Insider article explains that iPhones maintain a detailed list of every location the user of the phone frequents, including how long it took to get to that location, and how long the user stayed there. The article provides helpful information about where that information is stored on the phone, how the data can be deleted, and, perhaps more importantly, how to stop the tracking of that information. This information may be important for users, as well as companies that provide iPhones to their employees to use in connection with their work.

AB25 excepted natural persons acting as job applicants, employees, owners, directors, officers, medical staff members, and contractors of a CCPA-covered business from all of the CCPA protections except two: (i) providing them a notice of collection under Cal. Civ. Code Sec. 1798.100(b), and (ii) the right to bring a private civil action against a business in the event of a data breach caused by the business’s failure to maintain reasonable safeguards to protect personal information. The notice of collection must inform these persons as to the categories of personal information collected by the business and how those categories are used.

The CCPA’s definition of personal information includes eleven categories of personal information, one of which is geolocation data. As many businesses think about the categories of personal information they collect from employees, applicants, etc. for this purpose, geolocation may be the last thing that comes to mind. This is especially true for businesses with workforces that come into the office every day and that, unlike transportation, logistics, or home health care businesses, do not have a business need to know where their employees are. But these businesses still may provide their workforce members a company-owned iPhone or other smart device with similar capabilities, without realizing all of its capabilities or configurations.

As many organizations that have gone through compliance with the European Union’s General Data Protection Regulation have learned, the CCPA and other laws that may come after it in the U.S. will require businesses to think more carefully about the personal information they collect. They likely will find such information is being collected without their knowledge and not at their express direction, and they may have to communicate that collection (and use) to their employees.

New Year, New Shields: How Can You Prepare for the New York SHIELD Act?

As we’ve previously reported, the New York Stop Hacks and Improve Electronic Data Security Act (the “SHIELD Act”) goes into effect on March 21, 2020. The SHIELD Act, which amends the State’s current data breach notification law, imposes more expansive data security and data breach notification requirements on companies, in the hope of ensuring better protection for New York residents from data breaches of their private information. In anticipation of the SHIELD Act’s effective date, over the next several months we will highlight various aspects of the new law, and how to prepare. Under the Act, individuals and businesses who collect computerized data including private information about New York residents must implement and maintain reasonable administrative, physical and technical safeguards. The Act provides several safeguards which may be implemented to ensure compliance.

Administrative Safeguards:

  • Designate individual(s) responsible for security programs;
  • Conduct risk assessments;
  • Train and manage employees in security program practices and procedures;
  • Select capable service providers and require safeguards by contract; and
  • Adjust program(s) in light of business changes or new circumstances.

Physical Safeguards:

  • Assess risks of information storage and disposal;
  • Detect, prevent, and respond to intrusions;
  • Protect against unauthorized access/use of private information during or after collection, transportation and destruction/disposal; and
  • Dispose of private information within a reasonable amount of time after it is no longer needed for business purposes.

Technical Safeguards:

  • Assess risks in network and software design;
  • Assess risks in information processing, transmission and storage;
  • Detect, prevent, and respond to attacks or system failures; and
  • Regularly test and monitor the effectiveness of key controls, systems and procedures.

In addition to the safeguards recommended in the Act, organizations should also consider the following:

  • Developing access management plans;
  • Maintaining written policies and procedures;
  • Applying sanctions to individuals who violate the organization’s data privacy and security policies and procedures;
  • Implementing facility security plans;
  • Maintaining and practicing disaster recovery and business continuity plans;
  • Tracking inventory of equipment and devices;
  • Deploying encryption and data loss prevention tools;
  • Developing and practicing an incident response program;
  • Regularly updating antivirus and malware protections;
  • Utilizing two factor authentication; and
  • Maintaining and implementing a record retention and destruction policy.

With the effective date of the SHIELD Act inching closer, covered businesses should be assessing their data security programs and making adjustments as necessary to ensure compliance with the new law. As a reminder, while there are more flexible standards for small businesses (those with fewer than 50 employees and less than $3 million per year in gross revenue), these businesses still must implement a reasonable security program appropriate for the size and complexity of their business. Moreover, other state statutes and regulations must be factored into the security program. Additional resources on security program implementation are available here.

FCC Rules Online Faxes Are TCPA Exempt

The Telephone Consumer Protection Act (“TCPA”) saw lots of action in 2019, and in the final days of the year the Federal Communications Commission (“FCC”) issued a significant ruling concluding that “online fax services,” i.e., e-faxes, are outside the scope of the TCPA. The FCC’s ruling effectively forecloses the common “junk fax” class action lawsuits against companies that send e-faxes, assuming those faxes are not delivered to a traditional fax machine.

In 2005, the TCPA, which restricts telephone solicitations and the use of automated telephone equipment, was amended by the Junk Fax Prevention Act (JFPA), which restricts the use of fax machines to deliver unsolicited advertising.

The FCC ruling stems from a 2017 petition by Amerifactors asking the FCC to clarify that faxes sent to “online fax services” are not faxes sent to “telephone facsimile machines,” and therefore do not violate the TCPA. An online fax service is “a cloud-based service consisting of a fax server or similar device that is used to send or receive documents, images and/or electronic files in digital format over telecommunications facilities” that allows users to “access ‘faxes’ the same way that they do email: by logging into a server over the Internet or by receiving a pdf attachment [as] an email.” At the time, Amerifactors was defending a class action suit on claims that it violated the TCPA, where the bulk of the faxes at issue were received by consumers through “online fax services.”

In finding that “online fax services” are not “telephone facsimile machines,” the FCC turned to the plain language of the TCPA. That language demonstrates that Congress did not intend the statute’s prohibition to apply to faxes sent to equipment other than a “telephone facsimile machine.” In addition, the FCC highlighted precedent dating back to 2003 holding that faxes “sent as email over the Internet” are not subject to the TCPA. Faxes sent by online fax services via an attachment that the consumer can delete without printing are effectively the same as email sent over the Internet.

Importantly, the FCC notes that faxes sent by online fax services do not lead to the “specific harms” to consumers Congress sought to address in the TCPA.

“The House Report on the TCPA makes clear that the facsimile provisions of the statute were intended to curb two specific harms: “First, [a fax advertisement] shifts some of the costs of advertising from the sender to the recipient. Second, it occupies the recipient’s facsimile machine so that it is unavailable for legitimate business messages while processing and printing the junk fax.” The record is clear that faxes sent to online fax services do not pose these harms and, in fact give consumers tools such as blocking capabilities to control these costs.”

This ruling is considered a win for businesses and will likely have a sweeping impact on litigation in this area. Stay tuned for more TCPA-related developments in the coming year.

Are shareholders considered “consumers” under the CCPA?

It’s hard to overstate the range of issues the California Consumer Privacy Act (the “CCPA”) raises for covered businesses and their service providers. One of those issues involves the meaning of “consumer.” If you have been following CCPA developments, you know that at least for the first 12 months the CCPA is effective, the new law will, to a limited extent, apply to personal information of certain employees, applicants, and contractors. See AB 25.

But what about a covered business’s shareholders? Shareholders may not buy goods and services from the business, and they may not be employees of the business. However, some covered businesses, whether public or private, have shareholders who are natural persons residing in California, and from whom the business collects personal information. For example, businesses might collect personal information from shareholders through their investor relations websites, or the information might be collected on their behalf by third parties. Businesses subject to the CCPA should be considering what steps they need to take with respect to their shareholders or similarly-situated “consumers.”

In general, the CCPA defines “consumer” to mean a natural person who is a California resident. See Cal. Civ. Code Sec. 1798.140(g). That definition would seem to include shareholders of the business who are natural persons residing in California. However, there is a question whether, in their role as shareholders, they fall within the changes made by AB 25.

In general, the changes made by AB25 apply to personal information collected by a business about a natural person in the course of such person acting as a job applicant to or an employee, owner, director, officer, medical staff member, or contractor of that business, and to the extent the person’s personal information is collected and used by the business solely within the context of the natural person’s role or former role as a job applicant to or an employee, owner, director, officer, medical staff member, or contractor of that business.

That is a mouthful, but if shareholders are “owners,” shouldn’t they be covered by AB 25? Not in all cases. For purposes of this section of the law, “owner” means a natural person who either:

  1. Has ownership of, or the power to vote, more than 50 percent of the outstanding shares of any class of voting security of a business.
  2. Has control in any manner over the election of a majority of the directors or of individuals exercising similar functions.
  3. Has the power to exercise a controlling influence over the management of a company.

Shareholders without the ownership, control, or power noted above would not be considered “owners” for purposes of the changes made by AB 25. Additionally, for those shareholders, it does not appear that the “B2B” exception added under AB 1355 would apply. The relevant language in AB 1355 provides:

Personal information reflecting a written or verbal communication or a transaction between the business and the consumer, where the consumer is a natural person who is acting as an employee, owner, director, officer, or contractor of a company, partnership, sole proprietorship, nonprofit, or government agency and whose communications or transaction with the business occur solely within the context of the business conducting due diligence regarding, or providing or receiving a product or service to or from such company, partnership, sole proprietorship, nonprofit or government agency

Shareholders likely would not be engaged in this kind of activity in their role as shareholders.

Last week, the public comment period for the proposed regulations issued in October by Attorney General Xavier Becerra closed, and final regulations are expected shortly. Absent clarification by the Attorney General on whether CCPA obligations reach shareholders of a business, covered businesses should be considering shareholders as part of their compliance efforts.
