The EEOC’s Artificial Intelligence Guidelines and the Risk of Class Action Litigation

Fox Rothschild LLP

On May 12, 2022, the United States Equal Employment Opportunity Commission (“EEOC”) published its first guidelines regarding the use of artificial intelligence in employment, titled The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees.  While this technology may streamline the job application process and other employment decisions, the EEOC has raised concerns that it also may lead to widespread discrimination against legally protected individuals.  More recently, in January 2023, the EEOC held a hearing on the use of technology in employment decisions, indicating that the EEOC is devoting significant attention to this subject matter.

Disability Discrimination

The EEOC guidelines identify three common ways that an employer’s use of artificial intelligence (“AI”) and algorithmic decision-making tools may violate the Americans with Disabilities Act (“ADA”):

  • Reasonable Accommodations.  The ADA requires employers to provide reasonable accommodations to job applicants and employees with a disability, unless doing so would cause an undue hardship.  Employers who use AI or algorithmic decision-making tools to evaluate applicants and employees must be prepared to make reasonable accommodations.  For example, the EEOC guidelines explain that a job applicant who has limited manual dexterity due to a disability may have difficulty taking tests that require the use of a keyboard, trackpad or other device.  The ADA may require an employer to provide an alternative test as a reasonable accommodation, barring undue hardship.
  • “Screening Out” Disabled Individuals.  The ADA prohibits employers from “screening out” disabled individuals who are able to perform the essential functions of the job with a reasonable accommodation.  The EEOC guidelines provide the example of a chatbot that “screens out” applicants who have a significant gap in their job history.  If the gap in employment was related to the applicant’s disability (for instance, the individual stopped working to seek medical treatment), the chatbot may make an adverse employment decision based on disability.
  • Disability-Related Inquiries and Medical Examinations.  The ADA prohibits employers from making certain disability-related inquiries or seeking a medical examination from job applicants before making a conditional offer of employment.  AI tools could be used to elicit this information. 

As discussed in the EEOC guidelines, AI and algorithmic decision-making tools typically apply the same set of rules to make decisions as to all job applicants or employees.  While this may increase efficiency, it also significantly increases the risk of class, collective or representative action lawsuits.

Age Discrimination

On May 5, 2022, the EEOC filed a complaint against iTutorGroup, Inc. alleging a pattern or practice of age discrimination under the Age Discrimination in Employment Act (“ADEA”).  The EEOC claims that iTutorGroup’s application software automatically rejected female applicants age 55 or older and male applicants age 60 or older.  According to the EEOC’s complaint, iTutorGroup asked applicants to provide their date of birth and programmed its job application software to “screen out” older applicants.  The EEOC claims that on March 29, 2020, Charging Party Wendy Pincus applied for a position online using her real birthdate, and she was immediately rejected because she was over the age of 55.  The next day, Pincus allegedly applied using a more recent birthdate and was offered an interview.  The EEOC claims that Pincus and all other aggrieved individuals met the necessary qualifications for the position, but they were denied employment because of their age.  The EEOC seeks back pay and liquidated damages on behalf of approximately 200 individuals who were “screened out” through the use of this software.

iTutorGroup has denied the EEOC’s allegations.  On April 6, 2023, Magistrate Judge Peggy Kuo of the Eastern District of New York held an initial conference.  The first phase of discovery will not close until February 2024.

Recommendations

The EEOC’s complaint against iTutorGroup, Inc. involving an employer’s use of technology to make employment decisions may be the first of many to come.  To avoid potential liability, employers who use AI and algorithmic decision-making tools should confirm that the technology is used to measure abilities and qualifications that are truly necessary for the job—even for individuals who may be entitled to a reasonable accommodation.  Employers should not rely on factors that merely correlate with desired qualifications, such as an uninterrupted job history.  Employers also should confirm that the technology does not ask individuals questions that are likely to elicit information about physical or mental impairments or age.

In addition, employers should notify job applicants and employees of their right to request a reasonable accommodation and provide clear instructions on how to do so.  Staff should be trained to recognize such requests and respond as quickly as possible.  Lastly, employers should not rely exclusively on AI or algorithmic decision-making tools.  Employers should be prepared to offer alternatives as a reasonable accommodation for disabled individuals when required.
