EEOC Releases New Guidance on Algorithms and Disability Bias in Hiring

Dunlap Bennett & Ludwig PLLC

On May 12, 2022, the Equal Employment Opportunity Commission (EEOC) and the U.S. Department of Justice issued guidance to caution employers about using artificial intelligence (AI) and software tools to make employment decisions. The guidance, titled “The Americans with Disabilities Act and the Use of Software, Algorithms, and Artificial Intelligence to Assess Job Applicants and Employees,” warns that using these tools without safeguards could result in an Americans with Disabilities Act (ADA) violation. Produced as part of the Artificial Intelligence and Algorithmic Fairness Initiative launched in October 2021, the guidance reflects the agency’s growing interest in employers’ use of AI, machine learning, natural language processing, and other emerging technologies in employment decisions.

According to April 2022 testimony before the House Education and Labor Subcommittee on Civil Rights and Human Services, the EEOC’s AI initiative is a key component of the agency’s efforts to advance its systemic work. The initiative aims to educate applicants, employees, employers, and technology vendors about the legal requirements in this area and to ensure that new hiring tools do not perpetuate discrimination. The guidance is the initiative’s first substantive output and offers key insights into the EEOC’s thinking on these tools and its likely enforcement priorities going forward.

Tools That Unlawfully Screen Out Persons With Disabilities 

The guidance explains that a tool might “screen out” an individual on the basis of disability if the individual’s disability prevents the individual from meeting the tool’s selection criteria or results in a negative rating under those criteria. If the individual loses a job opportunity as a result, a violation of the ADA may occur. Examples include screens that automatically eliminate applicants with significant gaps in their employment history (gaps that may themselves result from a disability), or that measure and rate physical or mental traits, such as speech patterns or the ability to solve certain games, that a disability may affect.

Importantly, employers may not rely on a vendor’s assessment that a tool is “bias free” for validation purposes. Such assessments may focus only on other protected characteristics, such as race or gender, and fail to properly evaluate impact on the basis of disability. Further, unlike other protected characteristics, each disability is unique in the limitations it imposes, and a general assessment of a tool is unlikely to cover all the potential ways a disability may interact with that tool. Finally, a vendor assessment may simply be invalid or poorly designed. As the ultimate decision maker, the employer is liable for the results the tool produces and is responsible for ensuring legal compliance.

Duty to Provide Reasonable Accommodation 

In its guidance, the EEOC notes that employers using evaluation tools must consider reasonable accommodations for applicants or employees. These can include accessibility accommodations for persons who have difficulty taking tests or using the tools because of dexterity limitations, or who require adaptive technologies, such as screen readers or closed captioning, to apply effectively. This obligation applies even when the employer has outsourced the evaluation or the operation of the tool to a third party or vendor.

The guidance further explains that the ADA’s reasonable accommodation requirement may necessitate waiving the use of these tools in certain situations. AI and algorithmic tools are designed to measure an individual’s suitability for a particular position. Employers will need to consider requests for accommodation, including waiver, from applicants who are unable to meet the criteria used by a particular tool to measure fit but are otherwise able to show that they can perform essential job functions. This is the case even when the tools are validated for certain traits. As discussed above, the EEOC believes that the unique nature of each disability makes it possible for an individual to show that a generally validated screen still unlawfully screens that individual out on the basis of the individual’s particular limitations.

As a takeaway, employers should remember that employees are humans, and sometimes decisions about humans need to be made by humans, not computers. Metrics tracked by computers should be just one piece of the puzzle when evaluating employees’ performance. 


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Dunlap Bennett & Ludwig PLLC | Attorney Advertising
