Ever since the privacy and security regulations were issued under the federal Health Insurance Portability and Accountability Act (HIPAA), critics have pointed to the limitations on the reach of those rules. A critical limitation advanced by privacy advocates is that the well-known health data privacy rule extends only to certain covered entities and their business associates, not to health data generally. On April 17, 2023, Washington’s legislature passed House Bill 1155, also known as the My Health, My Data Act. The bill aims to address health data collected by entities not covered by HIPAA, including certain apps and websites.

If signed by the governor, most sections of the law would take effect on March 31, 2024, though certain parts of the legislation may take effect sooner.

When would the law apply?

A “regulated entity” for purposes of the law is any legal entity that:

  • Conducts business in the State of Washington, or produces or provides products or services that are targeted to consumers in Washington, and
  • Alone or jointly with others, determines the purposes and means of collecting, processing, sharing, or selling consumer health data.

The legislation creates a subgroup of regulated entities, known as “small businesses,” largely to provide a few more months to comply. Small businesses are regulated entities that satisfy one or both of the following thresholds:

  • Collects, processes, sells, or shares consumer health data of fewer than 100,000 consumers during a calendar year; or,
  • Derives less than 50 percent of gross revenue from the collection, processing, selling, or sharing of consumer health data and controls, processes, sells, or shares consumer health data of fewer than 25,000 consumers.

Who is protected by the law?

Under the legislation, a protected consumer is defined as a natural person who is a Washington resident or a natural person whose consumer health data is collected in Washington.

A consumer is protected only when acting in an individual or household context; the definition does not include an individual acting in an employment context.

What data is protected by the law?

The law would protect “consumer health data,” defined as personal information that is linked or reasonably linkable to a consumer and that identifies the consumer’s past, present, or future physical or mental health status. Health status includes but is not limited to the following:

  • Individual health conditions, treatment, diseases, or diagnosis
  • Social, psychological, behavioral, and medical interventions
  • Health-related surgeries or procedures
  • Use or purchase of prescribed medications
  • Bodily functions, vital signs, symptoms, or measurements of health-related functions
  • Diagnoses or diagnostic testing, treatment, or medication
  • Gender-affirming care information
  • Reproductive or sexual health information
  • Biometric data
  • Genetic data
  • Precise location information that could reasonably indicate a consumer’s attempt to acquire or receive health services and supplies
  • Data that identifies a consumer seeking health care services.

What are the rights of consumers?

Under HIPAA, individuals have several rights with respect to their protected health information (PHI). These rights include the right to authorize disclosures in certain contexts (and revoke those authorizations), to request an amendment, to request an accounting of disclosures, to request a restriction on use and disclosure, and to be notified of a breach. The Washington legislation would provide consumers with the right to:

  • Confirm whether their consumer health data is being collected, shared, or sold, including a list of all third parties and their affiliates with whom the data has been shared and their contact information.
  • Consent to or deny collection or sharing of health data.
  • Withdraw consent from a regulated entity or small business to collect or share health data.
  • Delete health data collected by a regulated entity or small business, including on archived or backup systems.
  • Be provided clear and conspicuous disclosure of rights to consent or deny collection or sharing of health data.

The provisions concerning the administration of these rights look a lot like the provisions in the California Consumer Privacy Act (CCPA) and other recently enacted state comprehensive data privacy laws.

What obligations do businesses have?

The Washington law would add to the growing compliance burden on company websites, as it would require regulated entities and small businesses to maintain a consumer health data privacy policy prominently on their homepages. That policy must clearly and conspicuously disclose:

  • Categories of consumer health data collected and the purpose for which the data is collected.
  • Categories of sources from which the consumer health data is collected.
  • Categories of consumer health data that are shared.
  • A list of the categories of third parties and specific affiliates with whom consumer health data is shared.
  • How a consumer can exercise the rights provided under the law.

This too is very similar to obligations under the CCPA. Regulated entities and small businesses may not discriminate against a consumer for exercising any rights included under the law. They also must respond to requests from consumers to withdraw consent to collect or share health data. Moreover, they must respond to requests from consumers to delete their consumer health data. The law also would mandate contracts be in place with processors of consumer health data and codify specific data security obligations for regulated entities and small businesses, including specific access management requirements.

Additionally, the law would make it unlawful for “any person” (apparently not just regulated entities or small businesses) to implement a geofence around an entity that provides in-person health care services where such geofence is used to: (1) Identify or track consumers seeking health care services; (2) collect consumer health data from consumers; or (3) send notifications, messages, or advertisements to consumers related to their consumer health data or health care services.

How is the law enforced?

Under the new legislation, violations of the requirements for health care data would be enforceable either through prosecution by the State Attorney General’s Office or through private actions brought by affected consumers.

For additional information on Washington’s new privacy statute and other data privacy laws and regulations, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

On March 21, 2023, Virginia’s governor approved Senate Bill 1040, which prohibits an employer from using an employee’s social security number or any derivative as an employee’s identification number. The bill also prohibits including an employee’s social security number or any number derived from the social security number on any identification card or badge.

An employer who knowingly violates the new law may be subject to a civil penalty not to exceed $100 for each violation. Before a penalty is imposed, however, the employer must be provided notice of the violation by the state Commissioner and may request an informal conference regarding the violation.

The bill takes effect on July 1, 2023.

Virginia joins other states with similar prohibitions, such as New York; similar protections also exist under federal law.

If you have questions about Virginia’s bill or the protection of employees’ social security numbers, contact a Jackson Lewis attorney to discuss.

The Indiana Legislature is poised to pass Senate Bill 5, a comprehensive privacy statute (the “Act”), and send it on to the Governor. Once signed, the Act will become operative on January 1, 2026, and make Indiana the seventh state, after California, Colorado, Connecticut, Iowa, Utah, and Virginia to enact a comprehensive consumer privacy statute.

Key Elements

Similar to the Colorado Privacy Act (CPA) and the Virginia Consumer Data Protection Act (VCDPA), the Act was modeled in part on the CCPA, CPRA, and the EU General Data Protection Regulation (GDPR). But there are some variations. Key elements of the Act include:

When does the Act apply? The Act applies to persons that conduct business in Indiana or produce products or services that are targeted to residents of the state and that, during a calendar year:

  • Control or process personal data of at least 100,000 consumers who are residents of the state, or
  • Control or process personal data of at least 25,000 consumers who are residents of the state and derive more than 50% of gross revenue from the sale of personal data.

Are there exemptions? Persons not subject to the Act include the State of Indiana and state agencies, third-party contractors of the state and such agencies acting on their behalf (but only with respect to such contracts), financial institutions, HIPAA-covered entities and business associates, not-for-profit organizations, institutions of higher education, and public utilities.

Who is protected under the Act? The Act protects the personal information of a “consumer,” defined as an individual who:

  • Is a resident of the state, and
  • Is acting only for personal, family, or household purposes.

Like the recently passed Iowa statute, Indiana excludes individuals acting in a commercial or employment context from its definition of consumer.

What “personal data” is protected under the Act? Under the Act, personal data is defined broadly as information that is linked or reasonably linkable to an individual. The definition excludes de-identified data, aggregate data, or publicly available information.

What rights do consumers have under the Act? The Act provides consumers with the following rights:

  • The right to request confirmation of whether a business is processing their personal data and related information.
  • The right to access their personal data upon request.
  • The right to correct information a company possesses.
  • The right to delete personal information obtained by businesses.
  • The right to opt out of the processing of personal data for purposes of targeted advertising, sale of personal data, or certain profiling activities.

The rules surrounding the administration of these rights pull from similar language in the other state privacy laws – a 45-day period to respond, a verification requirement, and a right to appeal a controller’s adverse decision concerning a consumer right request.

What obligations do covered persons have?

The Act lays out a list of obligations for controllers which generally track the laws in the other states. Without limitation, controllers must:

  • limit the collection of personal data to what is adequate, relevant, and reasonably necessary in relation to the purposes for which such data is processed,
  • establish, implement, and maintain reasonable administrative, technical, and physical security practices to protect the confidentiality, integrity, and accessibility of personal data,
  • not discriminate against a consumer for exercising rights under the Act,
  • not process sensitive data without the consumer’s consent,
  • provide consumers with a privacy notice that explains, among other things, the categories of personal data the controller processes and shares with third parties, and
  • provide consumers the opportunity to opt out of the sale of personal data and explain the means to exercise these and other rights under the Act.

For processing activities created or generated after December 31, 2025, controllers need to conduct and document impact assessments for certain processing activities, such as the sale of personal data and the processing of sensitive data. In short, these assessments must weigh the benefits of the processing and the risks to the consumer, considering risk mitigation efforts by the controller.

With respect to processors, the Act requires that they adhere to the instructions of controllers, such as by assisting the controller with responding to consumer requests. Contracts between controllers and processors must include certain provisions, such as instructions for processing personal data and the nature and duration of the processing. Other required provisions include (i) a requirement for processors to make available all information in the processor’s possession needed to demonstrate the processor’s compliance with the Act, (ii) a requirement to cooperate with reasonable assessments of compliance by the controller (or to arrange for a qualified and independent assessor), and (iii) an obligation for the processor to push the Act’s required provisions down to the processor’s subcontractors.

How is the law enforced, any private right of action? Unlike the CCPA, Indiana’s statute does not include a private right of action for consumers. In fact, the Act states that “[n]othing in [the Act] shall be construed as providing the basis for a private right of action for violations of this article or any other law.” Instead, the state attorney general will have exclusive enforcement authority. Businesses that are found to have violated the law may face fines of up to $7,500 per violation.

For additional information on Indiana’s new privacy statute and other data privacy laws and regulations, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

On April 6, 2023, the New York City Department of Consumer and Worker Protection (“Department”) issued its Final Rules regarding automated employment decision tools (“AEDT”). As previously reported, New York City’s AEDT law, Local Law 144 of 2021, prohibits employers and employment agencies from using AEDT unless:

  • The tool has been subjected to a bias audit within a year of the tool being used or implemented;
  • Information about the bias audit is made publicly available; and,
  • Certain written notices have been provided to employees or job candidates.

Read the full article on Jackson Lewis’ Data Intelligence Reporter.

With advances in technology and business marketing come changes in the law and new litigation. Many businesses are familiar with the federal Telephone Consumer Protection Act (TCPA) but may be less familiar with Florida’s version, the Florida Telephone Solicitation Act (FTSA). A recent wave of class-action lawsuits stems from 2021 amendments to the FTSA, largely focusing on businesses that use phone calls and text messages to advertise their products and services. The following examines nuances of the FTSA and why measures businesses may have put in place to comply with the TCPA may not pass muster in Florida.

Difference Between the TCPA and FTSA

The critical difference between the TCPA (as currently interpreted by the Supreme Court) and the FTSA is set forth in Fla. Stat. §501.059(8)(a). Specifically, subsection (8)(a) provides: “A person may not make or knowingly allow a telephonic sales call to be made if such call involves an automated system for the selection or dialing of telephone numbers or the playing of a recorded message when a connection is completed to a number called without the prior express written consent of the called party.”

Under both the TCPA and the FTSA, violations require the use of automatic equipment. However, while the Supreme Court has clarified that, to qualify as an “automatic telephone dialing system” under the TCPA, a device must have the capacity either to store or to produce a telephone number using a random or sequential number generator, the FTSA does not currently include a similarly limited definition. Rather, the FTSA’s lack of a definition has opened the door for plaintiffs to argue that any automated system that dials numbers or selects the order in which numbers are dialed fits within the statute. In short, if a business’s automatic equipment dials from a list, it likely will not implicate the TCPA but still may create risk under the FTSA. This is a critical point because, in today’s technological environment, entities are far more likely to use a list for outreach than a random or sequential number generator.
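
For the technically inclined, the distinction the courts draw can be sketched in a few lines of Python. This is purely illustrative; the functions and numbers are hypothetical, and the sketch is not legal guidance.

    import random

    # "Random or sequential number generator" -- the kind of equipment the
    # Supreme Court held the TCPA's autodialer definition targets.
    def sequential_numbers(start: int, count: int) -> list:
        return [str(start + i) for i in range(count)]

    def random_numbers(count: int) -> list:
        return ["305" + "".join(random.choices("0123456789", k=7)) for _ in range(count)]

    # Automated dialing from a stored customer list: likely outside the TCPA's
    # autodialer definition after Facebook v. Duguid, but potentially still
    # within the FTSA's broader "automated system for the selection or dialing
    # of telephone numbers" language.
    customer_list = ["3055550101", "3055550102"]  # hypothetical numbers
    for number in customer_list:
        print(f"dialing {number}")  # stand-in for the actual dialing step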

In addition, the FTSA’s 2021 amendments added a requirement to obtain prior express written consent for such telephonic sales calls. The amendments also specify the elements that prior express written consent must contain.

The FTSA’s Critical Components

As noted, the FTSA prohibits all telemarketing sales calls, text messages, and direct-to-voicemail (“ringless voicemail”) messages using an “automated system for selection or dialing of telephone numbers or playing of a recorded message” without prior express written consent. In other words, it is against the law for a company to use automated telephone dialing systems or pre-recorded messages in any telemarketing sales call, text message, or direct-to-voicemail message without the prior express written consent of the individual receiving the call.

Consumers Covered

The law itself covers only Florida residents. The FTSA contains a rebuttable presumption that a call made to any Florida area code is made to a Florida resident or a person in Florida at the time of the call. The FTSA has been enforced against businesses located and incorporated outside Florida. While there may be legal challenges on this issue, it is imperative that out-of-state businesses ensure their communications to Florida residents comply with the FTSA.

Private Right of Action

Under the FTSA’s private right of action, a plaintiff may recover $500 in statutory damages for each automated-call or do-not-call violation. The FTSA also provides for treble damages of up to $1,500 for willful or knowing violations, plus attorneys’ fees.

Similar to the TCPA, the FTSA’s private right of action is not limited to automated calls but includes other violations such as calls to persons registered on Florida’s do-not-call list or making calls that fail to transmit the caller’s originating telephone number.

Based on the unique structure and provisions of the FTSA, in the past year businesses have faced a large volume of FTSA claims, normally pled on a class basis. This means that an individual claiming that he or she did not provide prior written consent to receive an automated text message or phone call brings the action individually and on behalf of all Florida residents who also received texts or calls from the same business dating back to the 2021 amendments to the FTSA. As such, potential damages for a thousand-person putative class can quickly climb to six or seven figures. Thus, compliance is key.

Currently, defendants challenge the amendments to the FTSA by arguing that the Act is unconstitutional, plaintiffs do not have standing or actual harm, the cases should not be treated as class actions, and a myriad of other arguments. While these legal challenges continue, companies across the nation that conduct business and marketing in Florida should ensure their consents and practices comply with this law.

If you have questions regarding the FTSA or related issues, please reach out to a member of our Privacy, Data, and Cybersecurity practice group to discuss.

Websites play a vital role for organizations. They facilitate communication with consumers, constituents, patients, employees, donors, and the general public. They project an organization’s image and promote goodwill, provide information about products and services, and allow for their purchase. Websites also inform investors about performance, enable job seekers to view and apply for open positions, and accept questions and comments from visitors to the site or app, among many other activities and functionalities. Because of this vital role, websites have become an increasing subject of regulation, making them a growing compliance concern.

Currently, many businesses are working to become compliant with the California Consumer Privacy Act (“CCPA”) which, if applicable, requires the conspicuous posting of a detailed privacy policy on a business’s website. But the CCPA is neither the first nor the last compliance challenge for organizations that operate websites and other online services. A growing compliance burden has led to a wide range of operational and content requirements for websites. The push for CCPA compliance and responding to the flood of ADA accessibility litigation may cause more organizations to revisit their websites and, in the process, uncover a range of other issues that have crept in over the years.

What are some of these requirements?

AI – Artificial Intelligence. Organizations are increasingly leveraging automated decision-making tools to enhance their businesses in a range of areas, including employment. Needless to say, artificial intelligence (AI) and similar technologies, which power these tools, are being targeted for regulation. For example, the New York City Council passed a measure that subjects the use of automated decision-making tools to several requirements. One of those requirements is a “bias audit.” Employers that intend to utilize such a tool must first conduct a bias audit and must publish a summary of the results of that audit on their websites. We cover more about NYC Local Law 144 here.

ADA Accessibility. When people think about accommodating persons with disabilities, they often are drawn to situations where a person’s physical movement in a public place is impeded by a disability – stairs to get into a library or narrow doorways to use a bathroom. Indeed, Title III of the Americans with Disabilities Act grants disabled persons the right to full and equal enjoyment of the goods, services, facilities, privileges, advantages, or accommodations of any place of public accommodation. Although websites were not around when the ADA was enacted, they are now, and courts are applying ADA protections to those sites. The question is whether a website or application is accessible.

Although not yet adopted by the Department of Justice, which enforces Title III of the ADA, guidelines established by the Web Accessibility Initiative appear to be the most likely place courts will look to assess the accessibility of a website to which Title III applies. State and local governments have similar obligations under Title II of the ADA, and those entities might find guidance here.

HIPAA…and tracking technologies, pixels. Anyone who has had a first visit to a doctor’s office likely was greeted with a HIPAA “notice of privacy practices” and asked to sign an acknowledgement of receipt. Most covered health care providers have implemented this requirement but may not be aware of the website requirement. HIPAA regulation 45 CFR 164.520(c)(3)(i) requires that a covered entity maintaining a website with information about its customer services or benefits must prominently post its notice of privacy practices on the site and make the notice available electronically through the site.

Beyond the notice posting requirement, websites of HIPAA covered entities and business associates have operational issues to consider. In December 2022, the U.S. Department of Health and Human Services (HHS) Office for Civil Rights (OCR) issued a bulletin with guidance concerning the use of online tracking technologies by covered entities and business associates under HIPAA. The OCR Bulletin follows a significant uptick in litigation concerning these technologies in industries including but not limited to healthcare. For healthcare entities, the allegations relate to the sharing of patient data obtained from patient portals and websites. We do a deeper dive into this issue here.

COPPA. The Children’s Online Privacy Protection Act (COPPA) was enacted to give parents more control concerning the information websites collect about their children under 13. Enforced by the Federal Trade Commission (FTC), COPPA requires websites and online services covered by the statute to post privacy policies, provide parents with direct notice of their information practices, and get verifiable consent from a parent or guardian before collecting personal information from children. COPPA applies to websites and online services directed to children under the age of 13 that collect personal information, and to sites and online services geared toward general audiences when they have “actual knowledge” they are collecting information from children under 13. Find out more about compliance here.

FTCA and more on tracking technologies. Speaking of the FTC, that agency also enforces the federal consumer protection laws, including Section 5 of the Federal Trade Commission Act (FTCA), which prohibits unfair and deceptive trade practices affecting commerce. When companies tell consumers they will safeguard their personal information, including on their websites, the FTC requires that they live up to these promises. Businesses should review their website disclosures to ensure they are not describing privacy and security protections that are not actually in place.

Further to the issue of website tracking technologies noted above under HIPAA, the FTC took enforcement action against digital healthcare companies for sharing user information via third-party tracking pixels, which enable the collection of user data. The FTC’s focus highlights that issues with pixel tracking are not only a concern for covered entities and business associates under HIPAA.

ACA – Transparency in Coverage. Pursuant to provisions in the Consolidated Appropriations Act, 2021, the Departments of Labor, Health and Human Services, and the Treasury issued regulations to implement the Transparency in Coverage Final Rules.  The Final Rules require certain health plans and health insurance issuers to post information about the cost to participants, beneficiaries, and enrollees for in-network and out-of-network healthcare services through machine-readable files posted on a public website.  The Final Rules for this requirement are effective for plan years beginning on or after January 1, 2022 (an additional requirement for disclosing information about pharmacy benefits and drug costs is delayed pending further guidance).

Comprehensive State Privacy Laws, including the CCPA. As mentioned above, a CCPA-covered business that maintains a website must post a privacy policy on its website through a conspicuous link on the home page using the word “privacy,” or on the download or landing page of a mobile application. That is not all. The website must also provide certain mechanisms for consumers (including employees and applicants) to contact the business about their CCPA rights, such as the right to require deletion of their personal information and the right to opt out of the sale of personal information. The latter must be provided through an interactive webform accessible via a clear and conspicuous link titled “Do Not Sell My Personal Information” or “Do Not Sell My Info.” Several of these requirements have been enhanced beginning in 2023 under the California Privacy Rights Act.

Since we originally published this post, five other states have enacted a similar data privacy framework – Colorado, Connecticut, Iowa, Utah, and Virginia. For organizations subject to those laws, additional work may be needed on their privacy policies to comply.

CalOPPA. Even if an organization is not subject to the CCPA, it still may be subject to the California Online Privacy Protection Act (CalOPPA). CalOPPA requires certain commercial operators of online services, including websites and mobile and social apps, which collect personally identifiable information from Californians to conspicuously post a privacy policy. Privacy policies should address how companies collect, use, and share personal information. Companies can face fines of up to $2,500 each time a non-compliant app is downloaded.

Delaware and Nevada. In 2016, Delaware became the second state to have an online privacy protection act, requiring similar disclosures to those under CalOPPA. Nevada enacted website privacy legislation of its own. First, like DelOPPA and CalOPPA, NRS 603A.340 requires “operators” to make a privacy notice reasonably accessible to consumers through its Internet website or online service. That notice must, among other things, identify the categories of covered information the operator collects through the site or online service about consumers who use or visit the site or service and the categories of third parties with whom the operator may share such covered information. In general, an operator is a person who: (i) owns or operates an Internet website or online service for commercial purposes; (ii) collects and maintains covered information from consumers who reside in this State and use or visit the Internet website or online service; and (iii) engages in any activity that constitutes sufficient nexus with this State, such as purposefully directing its activities toward Nevada. Effective October 1, 2019, Nevada added to its website regulation by requiring operators to designate a request address on their websites through which a consumer may submit a verified request to opt out of the sale of their personal information.

California’s Fair Chance Act. This is California’s version of the “ban the box” laws enacted in many states, which generally prohibit employers from asking job applicants about criminal convictions before making a conditional job offer. In California, the law imposes these restrictions on employers with five or more employees. Why is this a website requirement?

Recently, the state’s Department of Fair Employment and Housing (DFEH) announced new efforts to identify and correct violations of the statute by using technology to conduct mass searches of online job advertisements for potentially prohibited statements. The DFEH deems blanket statements in job advertisements indicating that the employer will not consider anyone with a criminal history to be violative of the statute. In its press release, the DFEH states that in one day of review it found over 500 job advertisements with statements that violate the statute. The DFEH has released a new Fair Chance Toolkit that includes sample forms and guides, and employers should also consider reviewing the descriptions of job opportunities on their websites.

California Transparency in Supply Chains Act. California seeks to curb slavery and human trafficking by making consumers and businesses more aware that the goods and products they buy could be supporting the commission of these crimes. To do so, the Transparency in Supply Chains Act requires large retailers and manufacturers to provide consumers with information regarding their efforts to eradicate slavery and human trafficking from their supply chains. This information must be conspicuously provided on the company’s website (or provided in writing if it does not have a website). To be subject to the law, a company must (a) identify itself as a retail seller or manufacturer in its tax returns; (b) satisfy the legal requirements for “doing business” in California; and (c) have annual worldwide gross receipts exceeding $100,000,000. To assist with compliance, the state has published a Resource Guide and Frequently Asked Questions.

GDPR. In 2018, the European Union’s General Data Protection Regulation (GDPR) became effective, reaching companies and organizations globally. In general, organizations subject to the GDPR that collect personal data on their websites must post a privacy policy on their websites setting forth the organization’s privacy practices.

Not-For-Profits, Donors, and Ratings. A donor’s decision to contribute to an organization is significantly affected by that organization’s reputation. To assist donors, several third-party rating sites, such as Charity Navigator, the Wise Giving Alliance, and CharityWatch, do much of the legwork for donors. They collect large amounts of data about these organizations, such as financial position, use of donated funds, corporate governance, transparency, and other practices. They obtain most of that data from the organizations’ Forms 990 and websites, where many organizations publish privacy policies.

Rating sites such as Charity Navigator base their ratings on comprehensive methodologies. A significant component of Charity Navigator’s rating, for example, relates to accountability and transparency, made up of 17 categories. A review of an organization’s website informs five of those 17 categories, namely (i) board members listed, (ii) key staff listed, (iii) audited financials published, (iv) Form 990 published, and (v) privacy policy content. Addressing some of these issues on an organization’s website could help boost its ratings and drive more contributions.

This is by no means an exhaustive list of the regulatory requirements that may apply to your website or online service. Organizations should regularly revisit their websites not just to add new functionality or fix broken links. They should have a process for ensuring that the sites or services meet the applicable regulatory requirements.

As we round the corner into the second quarter of 2023, the following enforcement dates for new or amended state data protection laws are quickly approaching.

  • New York City Local Law 144, Automated Employment Decision Tools: April 15, 2023.
  • California Consumer Privacy Act Regulations: July 1, 2023.
  • Colorado Privacy Act (CPA): July 1, 2023.
  • Connecticut Act Concerning Personal Data Privacy and Online Monitoring (CTDPA): July 1, 2023.
  • Virginia SB 1040 – Employer use of an employee’s social security number: July 1, 2023.

Depending on the law, preparation may require reviewing data collection and use practices, updating notices and internal policies and procedures, and conducting employee training.

If you have questions about data protection laws, cybersecurity, or related issues, contact a Jackson Lewis attorney to discuss.

Last week, a New York Times article discussed ChatGPT and AI’s “democratization of disinformation,” along with their potentially disruptive effects on upcoming political contests. Asking a chatbot powered by generative AI to produce a fundraising email is not the main concern, according to the article. Leveraging that technology to create and disseminate disinformation and deepfakes is. Some of the tactics described in the article, though intended to further political goals, are unsettling in contexts well beyond politics, including the workplace.

“Now any amateur with a laptop can manufacture the kinds of convincing sounds and images that were once the domain of the most sophisticated digital players. This democratization of disinformation is blurring the boundaries between fact and fake…”

Voice-cloning tools could be used, for example, to create convincing audio clips of political figures. One clip might convey a message that is consistent with the campaign’s platform, albeit never uttered by the candidate. Another clip might be produced to position the candidate in a bad light by suggesting the candidate was involved in illicit behavior or conveyed ideas damaging to the campaign, such as using racially-charged language. Either way, such clips would be misleading to the electorate. The same would be true of AI-generated images or videos.

“And as synthetic media gets more believable, the question becomes: What happens when people can no longer trust their own eyes and ears?”

It’s not hard to see how these same technologies, which are increasingly accessible to most anyone and relatively easy to use, can create significant disruption and legal risk in workplaces across the country. Instead of creating a false narrative about a political figure, a worker disappointed in his annual review might generate and covertly disseminate a compromising “video” of his supervisor. The failure to investigate a convincing deepfake video could have substantial and unintended consequences. Of course, the creation of this kind of misinformation can be directed at executives and the company as a whole.

Damaging disinformation and deepfakes are not the only risks posed by generative AI technologies. To better understand the kinds of risks an organization might face, assessing how workers are using ChatGPT and other similar generative AI technologies is a good first step. If a group of workers are like the millions of other people using ChatGPT, activities might include performing research, preparing draft communications such as the fundraising email in the NYT article discussed above, coding, and other tasks. Workers in different industries with different responsibilities likely will be approaching the technology with different needs and identifying a range of creative use cases.

Greater awareness about the uses of generative AI in an organization can help with policy development, but there are some policies that might make sense for most if not all applications of this technology.

Other workplace policies generally apply. A good example is harassment and nondiscrimination policies. As with an employee’s activity on social media, an employee’s use of ChatGPT is not shielded from existing policies on discrimination or harassment of others; those policies should continue to apply.

Follow the application’s terms and understand its limitations. Using online resources for company business in violation of the terms of use of those resources could create legal exposure for organizations. Also, employees should be aware of the capabilities and limitations of the tools they are using. For instance, while ChatGPT may seem omniscient, it is not, and it may not be up to date – OpenAI notes “ChatGPT’s training data cuts off in 2021.” Employees can avoid a little embarrassment for the organization (and themselves) by knowing this kind of information.

Avoid impermissible sharing of data. ChatGPT is just that, a chat or conversation with OpenAI that employees at OpenAI can view:

Who can view my conversations?

As part of our commitment to safe and responsible AI, we review conversations to improve our systems and to ensure the content complies with our policies and safety requirements.

Employees should avoid sharing personal information, as well as confidential information about the company or its customers, without understanding the obligations that may apply. For example, there may be contractual obligations to customers of the organization prohibiting the sharing of their confidential information with third parties. Similar obligations could be established through website privacy policies or statements through which an organization has represented how it would share certain categories of information.
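
One practical control, offered here only as a minimal sketch in Python, is to scrub obvious identifiers from text before it is submitted to an external AI tool. The patterns below are illustrative examples, not a complete or reliable filter for personal information.

    import re

    # Illustrative redaction patterns; a real control would need a far more
    # thorough inventory of the identifiers it must catch.
    REDACTIONS = [
        (re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"), "[EMAIL]"),
        (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),
        (re.compile(r"\b(?:\d[ -]?){13,16}\b"), "[CARD]"),
    ]

    def scrub(text: str) -> str:
        # Replace matched identifiers with placeholders before submission.
        for pattern, placeholder in REDACTIONS:
            text = pattern.sub(placeholder, text)
        return text

    print(scrub("Customer jane.doe@example.com, SSN 123-45-6789, asked about billing."))
    # -> Customer [EMAIL], SSN [SSN], asked about billing.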

Establish a review process to avoid improper uses. Information generated through AI-powered tools and platforms may not be what it seems. It may be inaccurate, incomplete, biased, or it may infringe on another’s intellectual property rights. The organization may want to conduct a review of certain content obtained through the tool or platform to avoid subpar service to customers or an infringement lawsuit.

There is a lot to think about when considering the impacts of ChatGPT and other generative AI technologies. This includes carefully wading through political blather during the imminent election season. It also includes thinking about how to minimize risk related to these technologies in the workplace. Part of that can be accomplished through policy, but there are other steps to consider, such as employee training, monitoring utilization, etc.

On March 28, 2023, Iowa’s Governor signed Iowa’s new statute relating to consumer data protection. Iowa joins California, Colorado, Connecticut, Utah, and Virginia in the ever-growing patchwork of consumer privacy laws across the country.

The new law takes effect on January 1, 2025.

Iowa’s consumer privacy law covers businesses that control or process personal data of at least 100,000 consumers in the state, or that control or process personal data of at least 25,000 consumers and derive more than 50% of gross revenue from the sale of personal data. A consumer under Iowa’s statute is defined as a natural person who is a resident of the state acting in an individual or household context. Individuals acting in a commercial or employment context are excluded.

Like other states’ comprehensive consumer privacy laws, the statute provides consumers with the right to access personal data being processed, to delete personal data, and to opt out of the sale of their personal data.

 For additional information on Iowa’s new privacy statute and other data privacy laws and regulations, please reach out to a member of our Privacy, Data, and Cybersecurity practice group.

The Federal Trade Commission (FTC) recently took enforcement action against digital healthcare companies for sharing user information via third-party tracking pixels, which enable the collection of user data. At the start of the year, the U.S. Health and Human Services Office for Civil Rights issued its own bulletin with guidance regarding tracking pixel technology for covered entities and business associates subject to the Health Insurance Portability and Accountability Act (HIPAA). However, the FTC’s new focus highlights that issues with pixel tracking are not only a concern for covered entities and business associates under HIPAA.

The following definition of pixel tracking from the FTC is helpful:

Tracking pixels have evolved from tiny, pixel-sized images on web pages for tracking purposes to include a broad range of HTML and JavaScript embedded in web sites (and email). Tracking pixels can be hidden from sight and can track and send all sorts of personal data such as how a user interacts with a web page including specific items a user has purchased or information users have typed within a form while on the site. Businesses often want to use them to track consumer behavior (pageviews, clicks, interactions with ads) and target ads to users who may be more likely to engage or purchase something based on that prior online behavior.
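
To make the mechanics concrete, the following is a minimal sketch in Python (using Flask) of the server side of a tracking pixel. The endpoint, parameters, and embed snippet are hypothetical illustrations, not any vendor’s actual implementation.

    from flask import Flask, Response, request

    app = Flask(__name__)

    # A 1x1 transparent GIF -- the classic "pixel" payload.
    PIXEL_GIF = (
        b"GIF89a\x01\x00\x01\x00\x80\x00\x00\x00\x00\x00\xff\xff\xff"
        b"!\xf9\x04\x01\x00\x00\x00\x00,\x00\x00\x00\x00\x01\x00\x01"
        b"\x00\x00\x02\x02D\x01\x00;"
    )

    @app.route("/pixel.gif")
    def pixel():
        # Whatever the embedding page appends to the image URL arrives here,
        # along with headers the browser sends automatically.
        record = {
            "page": request.args.get("page"),    # e.g., the URL being viewed
            "event": request.args.get("event"),  # e.g., a click or form interaction
            "ip": request.remote_addr,
            "user_agent": request.headers.get("User-Agent"),
            "referrer": request.headers.get("Referer"),
        }
        print("pixel fired:", record)  # a real tracker would store or forward this
        return Response(PIXEL_GIF, mimetype="image/gif")

    # The embedding site would include something like:
    # <img src="https://tracker.example.com/pixel.gif?page=/find-a-doctor&event=view"
    #      width="1" height="1" style="display:none">

The point of the sketch is that the “image” request itself is the data transfer: every query parameter and header rides along to the third party’s server whether or not the visitor ever sees anything.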

In its recent article about pixel tracking, the FTC discusses concerns about the practice:

  • Ubiquity and persistence. The FTC cited significant research indicating that thousands of the most visited websites have pixels potentially leaking personal information to third parties. And, unlike cookies, which can be disabled, “[p]ixel tracking can still occur even if cookies are disabled.”
  • Lack of clarity. The technology permits any kind of data to be shared and in some cases the providers of the technology are not sure what data is being shared. This can leave consumers in the dark about the categories of their personal information shared with third parties as a result of their activity on a website.
  • Steps to remove personal information may be ineffective. The agency notes that some attempts to appropriately remove personal information may be inadequate. As an example, while some pixel technologies attempt to “hash” personal information to scramble personally identifiable information, that scrambling can be reversed, as the sketch following this list illustrates.
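
To illustrate the final point, the following minimal Python sketch (with hypothetical email addresses) shows why hashing an identifier does not anonymize it: anyone holding a list of candidate identifiers can hash each one and match.

    import hashlib

    def sha256_hex(value: str) -> str:
        return hashlib.sha256(value.lower().encode("utf-8")).hexdigest()

    # What a tracker might receive: a "scrambled" identifier.
    shared_hash = sha256_hex("jane.doe@example.com")

    # A party holding a list of known emails simply hashes each candidate and
    # compares -- no cryptographic break is required.
    known_emails = ["john.smith@example.com", "jane.doe@example.com"]
    lookup = {sha256_hex(e): e for e in known_emails}

    print(lookup.get(shared_hash))  # -> jane.doe@example.com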

The concerns raised by the FTC are more general than just HIPAA and go to consumer privacy and data protection. For example, the FTC observed:

Companies using tracking pixels that impermissibly disclose an individual’s personal information (which may include health information) to third parties may be violating the FTC Act, the FTC’s Health Breach Notification Rule, the HIPAA Privacy, Security, and Breach Notification Rules, other state or federal statutes involving the disclosure of personal information, and your privacy promises to consumers.

As such, even companies outside of healthcare need to consider their use of pixel technology to ensure compliance with state and federal laws on the protection of consumer data. And, in particular, businesses need to consider what promises they are making to consumers, such as in their website privacy policies and terms of use.  

If you have questions about compliance with consumer privacy and data protection or related issues, contact a Jackson Lewis attorney to discuss.