AI, Government Contractors, and Employment Discrimination

Robinson+Cole Data Privacy + Security Insider

Increasingly, companies use AI to evaluate job applications and make interviewing or hiring decisions. Government contractors that use artificial intelligence to evaluate job applications should ensure not only that the AI complies with anti-discrimination laws but also that its use fulfills their contractual obligations. Federal contractors with contracts of $10,000 or more are subject to Executive Order 11246, which prohibits discrimination against job applicants and employees based on race, color, sex, sexual orientation, gender identity, religion, or national origin during the performance of the contract. This means that a government contractor accused of employment discrimination must worry about adverse contract actions as well as other potential legal consequences.

Moreover, the federal government has made clear that the use of AI can run afoul of anti-discrimination laws. In April 2023, the Consumer Financial Protection Bureau, the Department of Justice Civil Rights Division, the EEOC, and the FTC issued a “Joint Statement on Enforcement Efforts Against Discrimination and Bias in Automated Systems.” The Joint Statement explained, in literal bold letters: “Automated Systems May Contribute to Unlawful Discrimination and Otherwise Violate Federal Law.” The agencies “pledge[d] to vigorously use [their] collective authorities to protect individuals’ rights regardless of whether legal violations occur through traditional means or advanced technologies.”

Soon after issuing this statement, the EEOC made clear that it meant business. In August 2023, three integrated companies providing English-language tutoring services under the name “iTutorGroup” paid $365,000 to settle an EEOC lawsuit alleging that they had used AI to discriminate against older job applicants. Remarking on the lawsuit, EEOC Chair Charlotte A. Burrows wrote, “Age discrimination is unjust and unlawful. Even when technology automates the discrimination, the employer is still responsible.”

To guard against employment discrimination arising from AI bias, government contractors should practice rigorous AI governance – that is, the ability to direct, manage, and monitor an organization’s AI activities. AI governance is too complex to cover fully here, but, put simply, government contractors – and other employers – may wish to do the following, among other things:

  1. Establish written policies and procedures for the use of AI;
  2. Carefully monitor their AI to ensure that it is acting within legal and contractual bounds;
  3. Designate a person or team responsible for protecting against AI bias.
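As one illustration of the monitoring step above, an employer might periodically compare selection rates across applicant groups using the EEOC’s “four-fifths rule” of thumb, under which a group’s selection rate below 80% of the highest group’s rate may indicate adverse impact. The sketch below is a hypothetical, simplified example – the function names, group labels, and counts are invented, and the four-fifths rule is a screening heuristic, not a bright-line legal test or a substitute for legal advice.

```python
# Hypothetical adverse-impact screen based on the EEOC "four-fifths rule"
# of thumb: flag any group whose selection rate falls below 80% of the
# highest group's rate. All names and numbers here are illustrative.

def selection_rates(outcomes):
    """outcomes: {group: (selected, total_applicants)} -> {group: rate}"""
    return {group: selected / total for group, (selected, total) in outcomes.items()}

def four_fifths_flags(outcomes, threshold=0.8):
    """Return groups whose selection rate is below `threshold` times
    the highest group's selection rate (sorted for stable output)."""
    rates = selection_rates(outcomes)
    top_rate = max(rates.values())
    return sorted(g for g, r in rates.items() if r < threshold * top_rate)

# Invented example data: 48% vs. 30% selection rates.
outcomes = {
    "group_a": (48, 100),  # 48% selected (highest rate)
    "group_b": (30, 100),  # 30% selected: 0.30 < 0.8 * 0.48, so flagged
}
print(four_fifths_flags(outcomes))  # prints ['group_b']
```

A flag from a screen like this would not itself establish discrimination; it would signal that the AI tool’s outcomes warrant review by the person or team designated under step 3.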


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Robinson+Cole Data Privacy + Security Insider | Attorney Advertising
