Last month, the Illinois Supreme Court heard oral argument in the closely watched case of Cothron v. White Castle System Inc., and is set to decide when claims under Sections 15(b) and 15(d) of the Illinois Biometric Information Privacy Act accrue.

The court’s forthcoming decision in Cothron is likely to have a significant impact on Illinois employers who are facing BIPA litigation, or who use, or have used, biometric technology in the workplace.

Read Full Article at Law360

Co-authors: Nadine C. Abrams and Richard Mrizek 

In a ruling that may have a significant impact on the steady influx of biometric privacy suits under the Illinois Biometric Information Privacy Act (BIPA), 740 ILCS 14/1, et seq., the Illinois Supreme Court will soon weigh in on whether claims under Sections 15(b) and (d) of the BIPA “accrue each time a private entity scans a person’s biometric identifier and each time a private entity transmits such a scan to a third party, respectively, or only upon the first scan and first transmission.” Adopting a “per-scan” theory of accrual or liability under the BIPA would lead to absurd and unjust results, argued a friend-of-the-court brief filed by Jackson Lewis in Cothron v. White Castle System, Inc. in the Illinois Supreme Court, on behalf of a coalition of trade associations whose more than 30,000 members employ approximately half of all workers in the State of Illinois.

To date, more than 1,450 class action lawsuits have been filed under the BIPA. Businesses that collect, use, and store biometric data should be tracking the Cothron decision closely. The full update on Jackson Lewis’s brief in the Cothron case before the Illinois Supreme Court is available here.

Some members of the California legislature want their state to remain the leader in data privacy and cybersecurity regulation in the U.S., including protections for biometric information similar to those under the Biometric Information Privacy Act in Illinois, 740 ILCS 14 et seq. (BIPA). State Senator Bob Wieckowski introduced SB 1189 on February 17, 2022, which would add protections for biometric information in his state on top of other statutory provisions, such as the California Privacy Rights Act (CPRA), which goes into effect January 1, 2023.

If enacted, SB 1189 would significantly expand privacy and security protection for biometric information in California and likely influence additional legislative activity in the U.S. Notably, unlike some of the limitations on application in the California Consumer Privacy Act (CCPA), the Bill would apply to any private entity (defined as an individual, partnership, corporation, limited liability company, association, or similar group, however organized, other than the University of California). It could also open the door to a wave of litigation, similar to what organizations subject to the BIPA currently face.

SB 1189 includes a fairly broad definition of biometric information, tracking the definition under the CCPA that went into effect January 1, 2020:

(1) “Biometric information” means a person’s physiological, biological, or behavioral characteristics, including information pertaining to an individual’s deoxyribonucleic acid (DNA), that can be used or is intended to be used, singly or in combination with each other or with other identifying data, to establish individual identity.

(2) Biometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted, and keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, or exercise data that contain identifying information.

Many are familiar with or have encountered devices that scan fingerprints or a person’s face, which may capture or create biometric information. This definition appears to go beyond those more “traditional” technologies. So, for example, if you’ve developed a unique style of tapping away at your keyboard while at work, you might be creating biometric information. The contours of this definition are quite vague, so private entities should carefully consider the data sets they capture and the capabilities of new devices, systems, and equipment.

The Bill would prohibit private entities from collecting, capturing, purchasing, or otherwise obtaining a person’s biometric information unless the private entity:

  • requires the biometric information either to: (i) provide a service requested or authorized by the subject of the biometric information, or (ii) satisfy another valid business purpose (as defined in the CCPA) which is included in the written public policy described below, AND
  • first (i) informs the person or their legally authorized representative, in writing, both that biometric information is being collected, stored, or used, and of the specific purpose and length of time for which it is being collected, stored, or used, and (ii) receives a written release executed by the subject of the biometric information or their legally authorized representative.

In this regard, SB 1189 looks a lot like the BIPA, with some additional requirements for the written release. For example, the written release may not be combined with an employment contract or another consent form.

Under SB 1189, private entities in possession of biometric information also would be required to develop and make available to the public a written policy that establishes a retention schedule and guidelines for destroying biometric information. In general, destruction of the information would be required no later than one year after the individual’s last intentional interaction with the private entity. This is similar to the period required in the Texas biometric law.

In addition to requiring reasonable safeguards to protect biometric information, the Bill would place limitations on the disclosure of biometric information. Unless disclosed to complete a financial transaction requested by the data subject or disclosed as required by law, a written release would be required to disclose biometric information. The release would need to indicate the data to be disclosed, the reason for the disclosure, and the intended recipients.

Perhaps the most troubling provision of the Bill for private entities is section 1798.306. Again, looking a lot like the BIPA, SB 1189 would establish a private right of action permitting individuals to allege a violation of the law and bring a civil action for any of the following:

  • The greater of (i) statutory damages between $100 and $1,000 per violation per day, and (ii) actual damages.
  • Punitive damages.
  • Reasonable attorney’s fees and litigation costs.
  • Any other relief, including equitable or declaratory relief, that the court determines appropriate.

Though still early in the legislative process for SB 1189, its introduction illustrates a continued desire by state and local lawmakers to enact protections for biometric information. See, e.g., recent developments in New York, Maryland, and Oregon described in our Biometric Law Map. Before implementing technologies or systems that might involve biometric information, private entities need to carefully consider the emerging legislative landscape.

Effective July 9, 2021, certain retail and hospitality businesses that collect and use “biometric identifier information” from customers will need to post conspicuous notices near all customer entrances to their facilities.  These businesses will also be barred from selling, leasing, trading, sharing or otherwise profiting from the biometric identifier information they collect from customers.  Customers will have a private right of action to remedy violations, subject to a 30-day notice and cure period, with damages ranging from $500 to $5,000 per violation, along with attorneys’ fees.

These new requirements, which are set forth in an amendment to Title 22 of the NYC Admin. Code (the “Amendment”), apply to “commercial establishments,” a three-pronged category that includes:

  1. Food and drink establishments: Establishments that give or offer for sale to the public food or beverages for consumption or use on or off the premises, or on or off a pushcart, stand or vehicle.
  2. Places of entertainment: Privately or publicly owned and operated entertainment facilities, such as theaters, stadiums, arenas, racetracks, museums, amusement parks, observatories, or other places where attractions, performances, concerts, exhibits, athletic games or contests are held.
  3. Retail stores: Establishments wherein consumer commodities are sold, displayed or offered for sale, or where services are provided to consumers at retail.

The Amendment broadly defines “biometric identifier information” as a physiological or biological characteristic used to identify an individual, including, but not limited to: (i) a retina or iris scan, (ii) a fingerprint or voiceprint, (iii) a scan of hand or face geometry, or any other identifying characteristic.

The Amendment will take effect amidst a flurry of data privacy and security activity in New York.

  • Last year, the New York Department of Financial Services (“DFS”) filed its first enforcement action under New York’s Cybersecurity Requirements for Financial Services Companies, 23 N.Y.C.R.R. Part 500 (“Reg 500”). DFS also announced a $1.5 million settlement with a residential mortgage services provider earlier this year.
  • In another recent development, the Stop Hacks and Improve Electronic Data Security Act (“SHIELD Act”), which took effect in March 2020, requires organizations that own or license private information related to New York residents to, among other things, develop, implement, and maintain reasonable safeguards to protect that information, which includes biometric information.
  • Building on the momentum from Reg 500 and the SHIELD Act, several additional privacy bills are currently under consideration:
  • One is the Biometric Privacy Act, which, if enacted, could make New York the next hotbed of class action litigation over biometric privacy.
  • Another is the Tenant Privacy Act, which, among other things, would require owners of “smart access” buildings – i.e., those that use key fobs, mobile apps, biometric identifiers, or other digital technologies to grant access to their buildings – to provide privacy policies to their tenants prior to collecting certain types of data from them, as well as to strictly limit (a) the categories and scope of data that the building owner collects from tenants, (b) how it uses that data (including a prohibition on data sales), and (c) how long it retains the data.
  • Additionally, New York is considering two bills – S567 and A680 – which would grant consumers sweeping privacy rights, comparable to those available under the CCPA in California and CDPA in Virginia.

Jackson Lewis’ Privacy, Data & Cybersecurity Group has been closely monitoring these fast-moving developments and is available to assist organizations with their compliance and risk mitigation efforts.

On January 13, House Delegate Sara Love introduced the “Biometric Identifiers and Biometric Information Privacy Act” (the “Act”), substantially modeled after the Biometric Information Privacy Act in Illinois, 740 ILCS 14 et seq. (the “BIPA”). Enacted in 2008, the Illinois BIPA only recently triggered an avalanche of class actions in Illinois, spurring other legislative activity, including in New York. If enacted, Maryland’s Act would become effective January 1, 2022.

Just like the BIPA and the proposed law in the Empire State, the Act would establish rules for “private entities” possessing “biometric identifiers” and “biometric information” of a person, such as:

  • Development of a publicly available policy establishing retention and destruction guidelines,
  • Mandated reasonable safeguards relating to the storage, transmission, and disclosure of such information in a manner at least as protective as for “confidential and sensitive information,” such as social security numbers and account numbers,
  • Prohibiting private entities from profiting from the information, and
  • Limited right to disclose without consent.

Unlike the BIPA, the Maryland bill would clarify the policy need not be publicly available when it applies only to employees and is used only for internal operations.

Most important, the Act also would create a private right of action for persons “aggrieved” by violations of the Act, using language similar to the BIPA, permitting persons to recover the greater of (i) statutory damages of at least $1,000 for each negligent violation, or $5,000 for each intentional or reckless violation, and (ii) actual damages.

We know the Illinois Supreme Court decided that, in general, persons bringing suit under the BIPA do not need to allege actual injury or adverse effect, beyond a violation of their rights under the BIPA, in order to qualify as an “aggrieved” person and be entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the BIPA. See Rosenbach v. Six Flags Entertainment Corp.

As with the proposed BPA in New York, Maryland’s Act is not yet the law. However, if enacted, private entities covered by the Act should promptly take steps to comply. That is, they should review their time management, point of purchase, physical security, or other systems that obtain, use, or disclose biometric identifiers or biometric information against the requirements under the Act. Biometric identifiers under the Act include data of an individual generated by automatic measurements of that individual’s biological characteristics such as fingerprint, voiceprint, genetic print, retina or iris image, or any other unique biological characteristic that can be used to uniquely authenticate the individual’s identity. In this respect, the Act would be broader than the BIPA – in Illinois, a biometric identifier is limited to a retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry. There are, however, exclusions from biometric identifiers under the Act, such as writing samples, photographs, demographic data, physical descriptions (such as height and weight), and protected health information covered by HIPAA.

In the event private entities find technical or procedural gaps in compliance – such as not having a retention and destruction policy concerning such information or obtaining consent to provide biometric information to a third party – they should quickly remedy those gaps.

It is unclear whether courts in Maryland will interpret the availability of remedies under the Act, if enacted, the same as the Illinois Supreme Court in Rosenbach. However, if they do, the duties imposed on private entities subject to the law regarding the possession, retention, disclosure, safeguarding, and destruction of a person’s biometric identifiers or biometric information will define the statutory rights of persons protected by the law. Accordingly, when a private entity fails to comply with one of the Act’s requirements, that violation could constitute an invasion, impairment, or denial of a right under the Act resulting in the person being “aggrieved” and entitled to seek recovery.

Dubbed the “Biometric Privacy Act,” New York Assembly Bill 27 (“BPA”) is virtually identical to the Biometric Information Privacy Act in Illinois, 740 ILCS 14 et seq. (BIPA). Enacted in 2008, BIPA only recently triggered thousands of class actions in Illinois. If the BPA is enacted in New York, it likely will not take as long for litigation to begin under the new privacy law. Interestingly, late last year, Governor Cuomo signed AB A6787D which, among other things, prohibited the use of biometric identifying technology in schools at least until July 1, 2022.

Just like BIPA, the BPA would establish a comprehensive set of rules for companies possessing and/or collecting “biometric identifiers” and “biometric information” of a person, such as:

  • Development of a publicly available policy establishing retention and destruction guidelines
  • Informed consent required prior to collection
  • Limited right to disclose without consent
  • Mandated security and confidentiality safeguards
  • Prohibiting private entities from profiting from the data

Most important, the BPA also would create a private right of action for persons “aggrieved” by violations of BPA, using the same language as under BIPA, permitting persons to recover the greater of (i) statutory damages of at least $1,000 for each negligent violation, or $5,000 for each intentional or reckless violation, and (ii) actual damages.

We know the Illinois Supreme Court decided that, in general, persons bringing suit under BIPA do not need to allege actual injury or adverse effect, beyond a violation of their rights under BIPA, in order to qualify as an “aggrieved” person and be entitled to seek liquidated damages, attorneys’ fees and costs, and injunctive relief under the BIPA. See Rosenbach v. Six Flags Entertainment Corp.

Of course, the BPA is not currently the law in New York. However, if enacted, companies should immediately take steps to comply. That is, they should review their time management, point of purchase, physical security, or other systems that obtain, use, or disclose biometric information (any information, regardless of how it is captured, converted, stored, or shared, based on an individual’s retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry used to identify an individual) against the requirements under the BPA. In the event they find technical or procedural gaps in compliance – such as not providing written notice, obtaining a release from the subject of the biometric information, obtaining consent to provide biometric information to a third party, or maintaining a policy and guidelines for the retention and destruction of biometric information – they need to quickly remedy those gaps.

It is unclear whether courts in New York will interpret the availability of remedies under BPA, if enacted, the same as the Illinois Supreme Court in Rosenbach. However, if they do, the duties imposed on private entities subject to the law regarding the collection, retention, disclosure, and destruction of a person’s biometric identifiers or biometric information will define the statutory rights of persons protected by the law. Accordingly, when a private entity fails to comply with one of the BPA requirements, that violation could constitute an invasion, impairment, or denial of a right under the BPA resulting in the person being “aggrieved” and entitled to seek recovery.

As organizations aim to return to some type of normalcy and help ensure a healthy and safe workplace, many have implemented COVID-19 screening programs that check for symptoms, recent travel, and potential contact with the virus. Moreover, many states and localities across the nation are mandating or recommending the implementation of COVID-19 screening programs in the workplace and beyond. In many cases, organizations have leveraged various technologies, such as social distancing bands, apps, and thermal scanners, to streamline their screening programs.

Despite the benefits of COVID-19 screening programs, organizations should proceed carefully to examine not only whether a particular solution will have the desired effect, but whether it can be implemented in a compliant manner with minimal legal risk, particularly regarding the privacy and security implications. Just last week, Amazon was hit with a proposed class action lawsuit in Illinois state court claiming the company’s COVID-19 screening program violated Illinois’s Biometric Information Privacy Act (BIPA). According to the complaint, Amazon employees were required to undergo facial geometry scans and temperature scans before entering company warehouses, without the prior consent required by law when collecting biometric identifiers, such as a facial geometry scan.

The BIPA sets forth a comprehensive set of rules for companies doing business in Illinois when collecting biometric identifiers or information of state residents. The BIPA has several key features:

  • Informed consent prior to collection
  • Limited right of disclosure of biometric information
  • Written policy requirement addressing retention and data destruction guidelines
  • Prohibition on profiting from biometric data
  • A private right of action for individuals harmed by BIPA violations

Statutory damages can reach $1,000 for each negligent violation, and $5,000 for each intentional or reckless violation.

The complaint alleges that Amazon employees “lost the right to control” how their biometric data was collected, used and stored, exposing them to “ongoing, serious, and irreversible privacy risks — simply by going into work”.  In addition to claims of failure to notify employees and obtain express consent regarding their biometric data collection practices, the complaint also alleges that Amazon failed to develop and follow a publicly available retention schedule and guidelines for permanently destroying workers’ biometric data.

While this case is an important reminder of BIPA implications, implementing a COVID-19 screening program, or any type of social distancing or contact tracing technology to help prevent or limit the spread of coronavirus, can have privacy and security implications that extend well beyond the BIPA. Depending on the type of data being collected and who is collecting it, such practices may trigger compliance obligations under several federal laws, such as the Americans with Disabilities Act (ADA), the Genetic Information Nondiscrimination Act (GINA), and the Health Insurance Portability and Accountability Act (HIPAA). Other state laws should be considered as well, if applicable, such as the California Consumer Privacy Act (CCPA) and state laws that require reasonable safeguards to protect personal information and notification in the event of a data breach. International laws, including the General Data Protection Regulation (GDPR), also can affect screening programs depending on their scope. Beyond statutory or regulatory mandates, organizations will also need to consider existing contracts or services agreements concerning the collection, sharing, storage, or return of data, particularly for service providers supporting the screening program. Finally, whether mandated by law or contract, organizations should still follow best practices to help ensure the privacy and security of the data they are responsible for.

COVID-19 screening programs, as well as the extensive technology at our disposal or in development, are certainly helping organizations address the COVID-19 pandemic, ensure a safe and healthy workplace and workforce, and prevent future pandemics. Nevertheless, organizations must consider the legal risks, challenges, and requirements of any such technology prior to implementation.

Earlier this month, our Immigration Group colleagues reported the Department of Homeland Security (DHS) would release a new regulation to expand the collection of biometric data in the enforcement and administration of immigration laws. However, as reported by Roll Call, a DHS Inspector General report raised significant concerns about whether the Department is able to adequately protect sensitive biometric information, particularly with regard to its use of subcontractors. The expanded use of biometrics outlined in the Department’s proposed regulation, just like the increased use of biometric information such as fingerprints or facial recognition by private organizations, heightens the risk to such data.

The amount of biometric information maintained by DHS is already massive. The DHS Office of Biometric Identity Management maintains the Automated Biometric Identification System, a repository of biometric data on more than 250 million people that can process more than 300,000 biometric transactions per day. U.S. Customs and Border Protection (CBP) is mandated to deploy a biometric entry/exit system to record arrivals and departures to and from the United States, with the long-term goal to biometrically verify the identity of all travelers exiting the United States and ensure that each traveler has physically departed the country at air, land, and sea departure locations.

In 2018, CBP began a pilot effort known as the Vehicle Face System (VFS) in part to test the ability to capture volunteer passenger facial images as they drove by at speeds under 20 mph and the ability to biometrically match captured images against a gallery of recent travelers. DHS hired a subcontractor to assist with the development of the technology.

According to the inspector general’s report, DHS has a range of policies and procedures to protect biometric information, which it considers sensitive personally identifiable information (SPII). Among those policies, DHS’ Handbook for Safeguarding Sensitive PII, Privacy Policy Directive 047-01-007, Revision 3, December 2017, requires contractors and consultants to protect SPII to prevent identity theft or other adverse consequences, such as privacy incidents, compromise, or misuse of data.

Despite these policies, the DHS subcontractor engaged to support the pilot directly violated DHS security and privacy protocols when it downloaded SPII, including traveler images, from an unencrypted device and stored it on its own network. The subcontractor obtained access to this data between August 2018 and January 2019 without CBP’s authorization or knowledge. Later in 2019, the subcontractor’s network was subjected to a malicious cyberattack involving ransomware resulting in the compromise of 184,000 facial images of cross-border travelers collected through a pilot program, at least 19 of which were posted on the dark web.

As one of our 10 Steps for Tackling Data Privacy and Security Laws, “Vendors – trust but verify” is critical. For DHS, its failure to do so may damage the public’s trust resulting in travelers’ reluctance to permit DHS to capture and use their biometrics at U.S. ports of entry. Non-governmental organizations that experience a similar situation with one of their vendors face an analogous loss of trust, as well as adverse impacts on business, along with compliance enforcement and litigation risks.

Among the recommendations CBP made following the breach was to ensure implementation of USB device restrictions and to apply enhanced encryption methods. CBP also sent a memo requiring all IT contractors to sign statements guaranteeing compliance with contract terms related to IT and data security. Like DHS, more organizations are developing written policies and procedures following risk assessments and other best practices. However, it is not enough to prepare and adopt policies; implementation is key.

A growing body of law in the United States requires not only the safeguarding of personal information, including biometric information, by organizations that own it, but also by the third-party service providers that process it on behalf of the owners. Carefully and consistently managing vendors and their access, use, disclosure, and safeguarding of personal information is a critical part of any written information security program.

Whether it is facial recognition technology being used in connection with COVID-19 screening tools and in law enforcement, continued use of fingerprint-based time management systems, or the use of various biometric identifiers for physical security and access management, applications involving biometric identifiers and information in the public and private sectors continue to grow. Concerns about the privacy and security of that information continue to grow as well. Several states have laws protecting biometric information in one form or another, chief among them Illinois, but the desire for federal legislation remains.

Modeled after Illinois’s Biometric Information Privacy Act (BIPA), the National Biometric Information Privacy Act (Act), proposed by Sens. Jeff Merkley and Bernie Sanders, contains three key provisions:

  • A requirement to obtain consent from individuals prior to collecting and disclosing their biometric identifiers and information.
  • A private right of action against entities covered by the Act that violate its protections which entitles aggrieved individuals to recover, among other things, the greater of (i) $1,000 in liquidated damages or (ii) actual damages, for negligent violations of the protections granted under the law.
  • An obligation to safeguard biometric identifiers or biometric information in a manner similar to how the organization safeguards other confidential and sensitive information, such as Social Security numbers.

The Act would apply to “private entities,” generally including a business of any size in possession of biometric identifiers or biometric information of any individual. Federal, state, and local government agencies and academic institutions are excluded from the Act.

Under the Act, private entities would be required to:

  • Develop and make available to the public a written policy establishing a retention schedule and guidelines for permanently destroying biometric identifiers and biometric information. That schedule may not extend beyond one year after an individual’s last interaction with the entity, and destruction could be required earlier;
  • Collect biometric identifiers or biometric information only when needed to provide a service to the individuals or have another valid business reason;
  • Inform individuals that their biometric identifiers or biometric information are being collected or stored, along with the purpose and length of the collection, storage, or use, and receive a written release from individuals that may not be combined with other consents, including an employment agreement;
  • Obtain a written release immediately prior to the disclosure of any biometric identifier or biometric information that includes the data to be disclosed, the reason for the disclosure, and the recipients of the data; and
  • Maintain the information using a reasonable standard of care.

Readers familiar with the BIPA in Illinois will find these requirements familiar. Readers familiar with the California Consumer Privacy Act (CCPA) will find the following “Right to Know” familiar as well. The Act would grant individuals the right to request certain information about biometric identifiers or biometric information collected by a covered entity within the preceding 12-month period. This information includes “specific pieces of personal information” and “the categories of third parties with whom the business shares the personal information.” The Act uses “personal information” but does not define it, leaving it unclear if it pertains only to biometric identifiers and biometric information.

Most troubling is the private right of action provision referenced above. The Act uses language similar to that in the BIPA, which has led to a flood of class action litigation, including a decision by the Illinois Supreme Court finding plaintiffs need not show actual harm to recover under the law. The legislative process likely will result in some modification to the bill, assuming it survives at all; federal privacy bills often do not. Nonetheless, we will continue to monitor the progress of this and similar laws.

Pending legislation could create new consumer privacy rights in Massachusetts. Earlier this year, Senator Cynthia Creem presented An Act Relative to Consumer Data Privacy in the Massachusetts Senate. This Consumer Privacy Bill, SD.341, combines key aspects of the California Consumer Privacy Act (CCPA) and Illinois’s Biometric Information Privacy Act (BIPA). This bill would allow Massachusetts consumers a private right of action if their personal information or biometric information (referred to separately in the bill) is improperly collected.

The Consumer Privacy Bill defines “biometric information” as an individual’s physiological, biological or behavioral characteristics, including an individual’s DNA, that can be used, singly or in combination with each other or with other identifying data, to establish individual identity. Biometric information includes, but is not limited to, imagery of the iris, retina, fingerprint, face, hand, palm, vein patterns, and voice recordings, from which an identifier template, such as a faceprint, a minutiae template, or a voiceprint, can be extracted, and keystroke patterns or rhythms, gait patterns or rhythms, and sleep, health, or exercise data that contain identifying information.

The bill defines “personal information” as any information relating to an identified or identifiable consumer. “Personal information” means information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or the consumer’s device.

However, this definition does not include publicly available information, deidentified consumer information, or aggregate consumer information. Moreover, the bill creates an exception for a business collecting or disclosing personal information of its own employees, so long as the business is collecting or disclosing such information within the scope of its role as an employer. Therefore, unlike California's CCPA, where the application to employee data remains an open question, the current text of the Massachusetts bill makes clear that the law would not apply to employee data as described above. That said, it is still early in the legislative process, and the bill could be revised to include employee data.

The pending legislation would require businesses collecting a Massachusetts consumer’s personal information to notify the consumer of the following rights before the point of collection:

(1) The categories of personal information it will collect about that consumer;

(2) The business purposes for which the categories of personal information shall be used;

(3) The categories of third parties with whom the business discloses personal information;

(4) The business purpose for third party disclosure; and

(5) The consumer’s rights to request:

                  (A) A copy of the consumer’s personal information;

                  (B) The deletion of the consumer’s personal information; and

                  (C) To opt out of third party disclosure.

In addition to this notice requirement, the bill would give consumers a statutory right to request that businesses collecting their personal information disclose to the consumer:

(1) The specific pieces of personal information the business has collected about that consumer;

(2) The sources from which the consumer’s personal information was collected;

(3) The names of third parties to whom the business disclosed the consumer’s personal information; and

(4) The business purpose for third party disclosure.

Businesses would have to make available to consumers two or more designated methods for submitting consumer verified requests for personal information, including, if the business maintains a web site, a link on the home page of the web site. A business receiving a verifiable consumer request generally must provide the requested information within 45 days of receiving the request, but may extend that period once by an additional 45 days, so long as the request for the extension is provided within the first 45-day period. The proposed legislation also creates a consumer right to request that a business delete any personal information collected from the consumer, and the right to opt out of third party disclosure at any time.

The legislation would be enforceable both through a private right of action and by the Massachusetts Attorney General. For any violation of the act, a consumer could recover (1) damages in an amount not greater than $750 per consumer per incident or actual damages, whichever is greater; (2) injunctive or declaratory relief; and (3) reasonable attorney fees and costs. The Attorney General would be authorized to obtain a temporary restraining order or preliminary or permanent injunction against a violation of the Act. In addition, the Attorney General could seek a civil penalty of not more than $2,500 for each violation or $7,500 for each intentional violation.

This Consumer Privacy Bill would impose administrative burdens on businesses, including an obligation to train employees, and would create new exposure to damages and penalties. Given the litigation we are seeing under the BIPA, businesses collecting Massachusetts consumers' personal information should monitor the progress of this legislation to determine whether they should begin preparing to comply with yet another consumer privacy law.