May a rejected applicant sue an outside vendor directly for employment discrimination where the vendor’s artificial intelligence platform embedded on a prospective employer’s website executes unlawfully biased screening based on race or other protected characteristics?
A California federal judge recently said yes under federal anti-discrimination laws. A 2023 California Supreme Court ruling suggests the answer may be the same under California’s Fair Employment and Housing Act (FEHA). Here’s why.
Judge holds AI provider may be liable
In Mobley v. Workday Inc., Judge Rita Lin rejected human resources management services provider Workday’s motion to dismiss a complaint brought by Derek Mobley, a Black man over the age of 40 who suffers from depression and anxiety. Among other things, Mobley claimed he performed poorly on Workday-branded assessment and personality tests because of his mental disabilities.
Mobley unsuccessfully applied for over 100 jobs with companies that all used a Workday AI screening tool embedded on the companies’ websites. Mobley alleged the bias in Workday’s AI algorithms kept him from advancing in the hiring process at any of those companies.
Lin ruled that Workday may be liable for employment discrimination under federal anti-discrimination laws — separate from the companies to which Mobley applied (which he did not sue) — if the allegations are proven.
Discrimination statutes define “employer” to include employer’s agent
Title VII of the Civil Rights Act of 1964, the Age Discrimination in Employment Act, and the Americans with Disabilities Act include within the definition of a covered “employer” any “agent of the employer.”
Lin reasoned that Workday could be liable for discrimination because its client-employers delegated to the company, through its AI tools, a key role in hiring, a function traditionally exercised by the employer. Workday’s tools effectively decide which candidates advance beyond the initial screening.
“Given Workday’s allegedly crucial role in deciding which applicants can get their ‘foot in the door’ for an interview, Workday’s tools are engaged in conduct that is at the heart of equal access to employment opportunities,” Lin wrote.
Lin said it didn’t matter that Workday performed this initial screening function with an embedded algorithm rather than with humans. “Drawing an artificial distinction between software decisionmakers and human decisionmakers,” she wrote, could “gut anti-discrimination laws in the modern era.”
Like Title VII, FEHA defines “employer” to include “any person acting as an agent of an employer, directly or indirectly.” Mobley, however, did not assert a claim against Workday as an agent of the 100-plus employers that rejected him. Instead, he alleged Workday violated a separate FEHA provision prohibiting anyone from aiding or abetting unlawful discrimination. Judge Lin dismissed the FEHA claim because Mobley did not allege that any particular prospective employer discriminated against him or that Workday knew of any prospective employer’s unlawful discrimination.
California Supreme Court rules in 2023 that business-entity agent performing employer’s hiring function may be liable for discrimination under FEHA
Last year, in Raines v. U.S. Healthworks Medical Group, the California Supreme Court addressed whether a business entity acting as an employer’s agent, such as an AI vendor, could be liable under FEHA to rejected applicants when the agent performed an employment decision-making function with unlawful bias.
The defendants in Raines were prospective employer Front Porch and U.S. Healthworks (USHW), which provided a medical screening questionnaire that Front Porch required the plaintiff-applicants to complete. Front Porch allegedly used the results of that questionnaire, which allegedly elicited private health-related information unlawfully, to revoke conditional job offers.
The California Supreme Court, guided by federal rulings on which Lin relied, concluded a business entity acting as an employer’s agent may be held directly liable as an employer for employment discrimination under FEHA “when the business-entity agent has at least five employees and carries out FEHA-regulated activities (such as hiring) on behalf of an employer.”
The state high court observed that such an entity probably could “bear the cost of legal counsel to ensure that its policies and methods” were lawful and could even include clauses in its client agreements requiring employer-clients to indemnify it against potential FEHA liability.
Proposed California regulation of AI workplace decision-making tools
Meanwhile, the California Civil Rights Council, the rule-making arm of the Civil Rights Department, at a July 18 hearing continued to consider adopting regulations that would “clarify” the definitions of “agent” and “employment agency” under FEHA to include a third party that provides automated-decision systems an employer uses to screen applicants or to make other employment decisions.
California business groups argued the proposed rule would expose businesses not engaged in employment decision-making to FEHA liability and effectively negate the benefits of using AI tools.
Eaton is a partner with the San Diego law firm of Seltzer Caplan McMahon Vitek where his practice focuses on defending and advising employers. He also is an instructor at the San Diego State University Fowler College of Business where he teaches classes in business ethics and employment law. He may be reached at eaton@scmv.com.
Originally Published: July 29, 2024 at 5:00 a.m.