Earlier this month, the Federal Trade Commission (“FTC”) issued a report discussing “big data.” The report compiles what the agency has learned from recent seminars and research, including a public workshop held on September 15, 2014. Best known for its role as the federal government’s consumer protection watchdog, the FTC highlights in the report a number of concerns about uses of big data and the potential harm to consumers. However, while the report focuses on commercial uses of big data involving consumer data, it also describes a number of issues raised when big data is employed in the workplace.

Used in the human resources context, big data has many useful applications, such as helping companies better select and manage applicants and employees. The FTC’s report describes a study showing that “people who fill out online job applications using browsers that did not come with the computer . . . but had to be deliberately installed (like Firefox or Google’s Chrome) perform better and change jobs less often.” Applying this correlation in the hiring process can result in the employer rejecting candidates not because of job-related factors, but because they use a particular browser. Whether this would produce the best results for the company is unclear.

Likely spurred at least in part by comments made by EEOC counsel at the FTC’s big data workshop in September 2014, the FTC’s report summarizes the potential ways that using “big data” tools can violate existing employment laws, such as Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act, the Americans with Disabilities Act and the Genetic Information Nondiscrimination Act. The report also includes a brief discussion of “disparate treatment” and “disparate impact” theories, concepts familiar to many employers.

According to the report, facially neutral policies or practices that have a disproportionate adverse effect or impact on a protected class create a disparate impact, unless those practices or policies further a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact. Consider the application above. Use of a particular browser seems to be facially neutral, but some might argue that selection results based on that correlation can have a disparate impact on certain protected classes. Of course, as the FTC report notes with regard to other uses of big data, a fact-specific analysis will be necessary to determine whether a practice causes a disparate impact that violates the law.
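To make the disparate-impact concept concrete, one common preliminary screen is the EEOC’s “four-fifths” rule of thumb, which compares a group’s selection rate to that of the most-favored group. The sketch below is illustrative only, not legal advice; the applicant counts are hypothetical, and a real analysis is fact-specific and typically involves statistical testing as well.

```python
# Illustrative "four-fifths rule" screen for adverse impact.
# All applicant and selection counts below are hypothetical.

def selection_rate(selected, applicants):
    """Fraction of a group's applicants who were selected."""
    return selected / applicants

def adverse_impact_ratio(group_rate, reference_rate):
    """Ratio of a group's selection rate to the highest group's rate."""
    return group_rate / reference_rate

# Hypothetical outcomes of a browser-based screening tool.
rate_a = selection_rate(selected=30, applicants=100)  # group A: 30% selected
rate_b = selection_rate(selected=18, applicants=100)  # group B: 18% selected

ratio = adverse_impact_ratio(rate_b, rate_a)  # 0.18 / 0.30 = 0.6

# Under the rule of thumb, a ratio below 0.8 is often treated as
# preliminary evidence of adverse impact warranting closer review.
flagged = ratio < 0.8
```

In this made-up example the ratio is 0.6, below the four-fifths threshold, so the practice would merit a closer, fact-specific look of the kind the report describes.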

Two other concerns discussed in the FTC’s report that have workplace implications include:

  • Biases in the underlying data. Big data is about the collection, compilation and analysis of massive amounts of data. If hidden biases exist in these stages of the process, “then some statistical relationships revealed by that data could perpetuate those biases.” Yes, this means “garbage in, garbage out.” The report provides a helpful example: if a company’s big data algorithm considers only applicants from “top tier” colleges when making hiring decisions, the company may be incorporating previous biases in college admissions decisions. Thus, it is critical to understand existing biases in data, as they could undermine the usefulness of the end results.
  • Unexpectedly learning sensitive information. Employers using big data can inadvertently come into possession of sensitive personal information. The report describes a study which combined data on Facebook “Likes” and limited survey information to determine that researchers could accurately predict a male user’s sexual orientation 88 percent of the time, a user’s ethnic origin 95 percent of the time, and whether a user was Christian or Muslim 82 percent of the time. Possession of this information could expose an employer to claims that its hiring decisions were based on it, and not on other legitimate factors.

Companies can maximize the benefits and minimize the risks of big data, according to the FTC report, by asking the following questions:

  • How representative is your data set?
  • Does your data model account for biases?
  • How accurate are your predictions based on big data?
  • Does your reliance on big data raise ethical or fairness concerns?
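The first of these questions, how representative a data set is, can be checked in a rough, mechanical way by comparing group shares in a sample against a benchmark population. The sketch below is purely illustrative; the numbers and the 10-percentage-point threshold are assumptions, not anything prescribed by the FTC report.

```python
# Crude representativeness check: compare group shares in a data sample
# against a benchmark population. All figures here are hypothetical.

def group_shares(counts):
    """Convert raw group counts into fractional shares of the total."""
    total = sum(counts.values())
    return {group: n / total for group, n in counts.items()}

# Hypothetical benchmark population vs. the data actually collected.
benchmark = group_shares({"group_a": 520, "group_b": 480})  # 52% / 48%
sample = group_shares({"group_a": 700, "group_b": 300})     # 70% / 30%

# Flag any group whose share deviates from the benchmark by more than
# 10 percentage points -- an assumed threshold for this illustration.
skewed = {group for group in benchmark
          if abs(sample[group] - benchmark[group]) > 0.10}
```

Here both groups deviate by 18 percentage points, so this hypothetical sample would be flagged as unrepresentative before any model is built on it.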

There certainly is much to consider before using big data technology in the workplace, or for commercial purposes. As big data applications become more widespread and cost efficient, employers may feel the need to adopt them to remain competitive. They will need to proceed cautiously, however, and understand the technology, the data collected and whether the correlations work, and work ethically.

Joseph J. Lazzarotti

Joseph J. Lazzarotti is a principal in the Berkeley Heights, New Jersey, office of Jackson Lewis P.C. He founded and currently co-leads the firm’s Privacy, Data and Cybersecurity practice group, edits the firm’s Privacy Blog, and is a Certified Information Privacy Professional (CIPP) with the International Association of Privacy Professionals. Trained as an employee benefits lawyer, focused on compliance, Joe also is a member of the firm’s Employee Benefits practice group.

In short, his practice focuses on the matrix of laws governing the privacy, security, and management of data, as well as the impact and regulation of social media. He also counsels companies on compliance, fiduciary, taxation, and administrative matters with respect to employee benefit plans.

Privacy and cybersecurity experience – Joe counsels multinational, national and regional companies in all industries on the broad array of laws, regulations, best practices, and preventive safeguards. The following are examples of areas of focus in his practice:

  • Advising health care providers, business associates, and group health plan sponsors concerning HIPAA/HITECH compliance, including risk assessments, policies and procedures, incident response plan development, vendor assessment and management programs, and training.
  • Coaching hundreds of companies through the investigation, remediation, notification, and overall response to data breaches of all kinds – PHI, PII, payment card, etc.
  • Helping organizations address questions about the application, implementation, and overall compliance with European Union’s General Data Protection Regulation (GDPR) and, in particular, its implications in the U.S., together with preparing for the California Consumer Privacy Act.
  • Working with organizations to develop and implement video, audio, and data-driven monitoring and surveillance programs. For instance, in the transportation and related industries, Joe has worked with numerous clients on fleet management programs involving the use of telematics, dash-cams, event data recorders (EDR), and related technologies. He also has advised many clients in the use of biometrics including with regard to consent, data security, and retention issues under BIPA and other laws.
  • Assisting clients with growing state data security mandates to safeguard personal information, including steering clients through detailed risk assessments and converting those assessments into practical “best practice” risk management solutions, including written information security programs (WISPs). Related work includes compliance advice concerning FTC Act, Regulation S-P, GLBA, and New York Reg. 500.
  • Advising clients about best practices for electronic communications, including in social media, as well as when communicating under a “bring your own device” (BYOD) or “company owned personally enabled device” (COPE) environment.
  • Conducting various levels of privacy and data security training for executives and employees.
  • Supporting organizations through mergers, acquisitions, and reorganizations with regard to the handling of employee and customer data, and the safeguarding of that data during the transaction.
  • Representing organizations in matters involving inquiries into privacy and data security compliance before federal and state agencies including the HHS Office of Civil Rights, Federal Trade Commission, and various state Attorneys General.

Benefits counseling experience – Joe’s work in the benefits counseling area covers many areas of employee benefits law. Below are some examples of that work:

  • As part of the firm’s Health Care Reform Team, advising employers and plan sponsors regarding the establishment, administration and operation of fully insured and self-funded health and welfare plans to comply with ERISA, IRC, ACA/PPACA, HIPAA, COBRA, ADA, GINA, and other related laws.
  • Guiding clients through the selection of plan service providers, along with negotiating service agreements with vendors to address plan compliance and operations, while leveraging data security experience to ensure plan data is safeguarded.
  • Counseling plan sponsors on day-to-day compliance and administrative issues affecting plans.
  • Assisting in the design and drafting of benefit plan documents, including severance and fringe benefit plans.
  • Advising plan sponsors concerning employee benefit plan operation, administration and correcting errors in operation.

Joe speaks and writes regularly on current employee benefits and data privacy and cybersecurity topics, and his work has been published in leading business and legal journals and media outlets, such as The Washington Post, Inside Counsel, Bloomberg, The National Law Journal, Financial Times, Business Insurance, HR Magazine and NPR, as well as the ABA Journal, The American Lawyer, Law360, Bender’s Labor and Employment Bulletin, the Australian Privacy Law Bulletin and the Privacy and Data Security Law Journal.

Joe served as a judicial law clerk for the Honorable Laura Denvir Stith on the Missouri Court of Appeals.