Don't use AI to get around the ADA!

Constangy, Brooks, Smith & Prophete, LLP

Warning -- I'm about to go on a rant.

Do you ever read something in the news that just makes you go, "Sheesh, people!!!" Or words to that effect?

And, no, I am not talking about the Presidential Election.

The Wall Street Journal had an article this week about employers who use artificial intelligence to determine whether their executives are at risk of developing dementia. Here's a link, but you may need a paid subscription to access it.

The technology, I admit, sounds pretty cool in some ways. The AI can apparently tell from people's patterns of speech whether they are at risk . . . long before a qualified human physician would be able to diagnose the condition.

Although impressed with this development from a tech standpoint, in my head I was screaming, "What about the ADA? What about the ADA? Has anybody thought about the ADA?"

The article did not discuss the fact that an employer's using AI in this way might violate the Americans with Disabilities Act. But I think it is a big risk for employers. Worse than the risk that a perfectly compos mentis executive might get dementia six or seven years down the road.

The article says that the AI is correct in about 80 percent of cases. In other words, the AI is wrong in about 20 percent, or one-fifth, of cases. And, of course, the employer won't realize that the AI was wrong until it's too late because the AI is predicting future dementia, not diagnosing current dementia.

I guess I don't have dementia (yet) because I am able to recall that in early 2023, I asked ChatGPT to write a blog post for me on Groff v. DeJoy, a religious accommodation case that at the time was going to be heard by the U.S. Supreme Court. (That case has since been heard and decided.) ChatGPT did a nice job writing my post, except for one little detail . . . it said that Groff was a disability accommodation case under the Rehabilitation Act of 1973 instead of a religious accommodation case under Title VII. It also said that the case had multiple plaintiffs instead of a single plaintiff. Here's the quote it gave me:

“The Supreme Court has recently announced that it will review the case of Groff v. DeJoy, a case that has the potential to significantly impact the rights of individuals with disabilities in the workplace. This case was brought forth by a group of individuals with disabilities who argue that the United States Postal Service (USPS) failed to accommodate their disabilities in violation of the Rehabilitation Act of 1973.”

(Emphasis was mine.)

At least ChatGPT got the name of the case right.

Since I wrote that post, we've been hearing about lawyers who write briefs with the "help" of AI and then get sanctioned by the courts because the AI made up case law, meaning the lawyers were citing nonexistent court decisions in support of their clients' positions. As a result, many courts now have rules requiring AI-using attorneys to check their cases the old-fashioned way before submitting their briefs, and to certify to the courts that they have done so.

And we want to use AI to predict whether a person will develop a devastating medical condition at some indeterminate time in the future? And we want to use that "information" in making employment decisions? Will that get employers in trouble?

Um, yes, it will.

Seven reasons why this probably violates the ADA

Here is why using AI in this way is going to get employers in trouble under the ADA and also under many state disability protection laws:

No. 1: Dementia is a disability, as are many other medical conditions.

No. 2: I feel sure that the U.S. Equal Employment Opportunity Commission, which enforces the employment provisions of the ADA, will say that a medical assessment conducted by AI is a "medical examination." Heck, the EEOC treats it as a restricted disability-related inquiry when a frontline supervisor casually asks an employee whether she's limping because she has a bad hip.

No. 3: The ADA prohibits employers from requiring job applicants to undergo any sort of "medical examination" before a conditional offer of employment has been made.

No. 4: The ADA allows employers to conduct "medical examinations" after a conditional offer of employment has been made, but the information obtained cannot be used to disqualify the offeree. The only exception applies if the medical examination indicates that the offeree cannot perform the essential functions of the job, with or without a reasonable accommodation. I don't think a four-out-of-five chance of getting dementia in six years is going to cut it.

No. 5: Generally, it violates the ADA for an employer to discriminate against an applicant, offeree, or employee based on a concern that the individual "might" develop a medical condition in the future.

No. 6: An employer may not require a current employee to undergo a "medical examination" unless the examination is "job-related and consistent with business necessity." In other words, there has to be a job-related reason for requiring the medical examination, such as a performance issue or behavior concern that could reasonably be attributed to a medical condition. Sending an executive (or any other employee) for a medical examination to determine whether the individual is at risk for developing a medical condition in the future is not going to cut it.

No. 7: Merely asking these questions or requiring these examinations without a legal justification is an ADA violation, even if the employer never actually uses the information against the employee. And, of course, if the information is used against the employee -- look out!

I will end on a positive note. If an employee is showing objective signs of developing dementia (or some other medical condition that seems to be affecting job performance or behavior), the ADA would allow the employer to send the employee for a medical examination to determine

  • whether the employee can perform the essential functions of the job,
  • whether reasonable accommodation is necessary or possible, and
  • the types of accommodations that might be advisable.

In this context, the medical examination is likely to be "job-related and consistent with business necessity." And the use of AI to assist with the diagnosis (or reasonable accommodation recommendations) should not create an ADA problem.

*whew* Thanks, you guys. I feel better now. 

DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Constangy, Brooks, Smith & Prophete, LLP
