According to a recent New York Times article, “Facebook scrambled on Monday to respond to a new and startling line of attack: accusations of political bias.” Slate followed with a report that the online social networking giant became the subject of a United States Senate inquiry, with Commerce Committee Chairman John Thune wanting information about how Facebook chooses stories for its “Trending” section, among other things. According to the reports, Facebook promotes its Trending section as an algorithmic tool that identifies the stories people using the site are most interested in at a given point in time, while former “curators” of the section tell a different story: that the Trending section is a more subjective tool than users may realize.
Either way, the controversy raises an interesting issue – if Facebook’s Trending section is driven primarily by algorithms (and not curators), could those algorithms be politically biased? If so, could algorithms used in other contexts also have embedded biases, albeit unintentional ones? If algorithms were deployed in the area of human resources, could conscious or unconscious bias undermine the employer’s desired results and violate existing employment laws, such as Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act, and the Americans with Disabilities Act?
We wrote about a recent FTC report discussing some of these concerns, including the potential for liability from uses of data analytics based on “disparate treatment” or “disparate impact” theories. We noted there that facially neutral policies or practices that have a disproportionate adverse effect or impact on a protected class create a disparate impact, unless those practices or policies further a legitimate business need that cannot reasonably be achieved by means that are less disparate in their impact.
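To make the disparate-impact concept more concrete: one common first-pass screen is the EEOC’s “four-fifths” (80%) guideline, under which a selection rate for a protected group that is less than four-fifths of the rate for the highest-selected group may indicate adverse impact. The sketch below applies that guideline to hypothetical selection outcomes from an automated screening tool; the group names and numbers are purely illustrative, not drawn from any real case.

```python
# Illustrative sketch of the EEOC "four-fifths" (80%) rule, a common
# first-pass screen for disparate impact. A selection rate below 80%
# of the highest group's rate may warrant closer scrutiny.
# All group names and figures below are hypothetical.

def selection_rate(selected: int, applicants: int) -> float:
    """Fraction of applicants who were selected."""
    return selected / applicants

def four_fifths_check(rates: dict) -> dict:
    """Return groups whose rate falls below 80% of the highest group's rate,
    mapped to their impact ratio (group rate / highest rate)."""
    top = max(rates.values())
    return {g: r / top for g, r in rates.items() if r / top < 0.8}

# Hypothetical outcomes from an algorithmic screening tool:
rates = {
    "group_a": selection_rate(45, 100),  # 0.45
    "group_b": selection_rate(30, 100),  # 0.30
}

flagged = four_fifths_check(rates)
# group_b's impact ratio is 0.30 / 0.45, roughly 0.667 — below the 0.8 threshold
print(flagged)
```

A flagged ratio is only a screening signal, not a legal conclusion; as noted above, a practice with disparate impact may still be lawful if it furthers a legitimate business need that cannot reasonably be achieved by less discriminatory means.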
Employers and their data scientists, with appropriate counsel, should consider these issues carefully to ensure their enormously powerful and valuable analytics programs produce reliable results with minimal legal risk.