AI and Healthcare: Decoding the Latest 1557 Non-Discrimination Regulations

Bricker Graydon LLP
[co-author: Mckenzie Harris]

Section 1557 of the Affordable Care Act (ACA) prohibits providers and health plans that receive reimbursement from the federal government ("Covered Entities") from discriminating against individuals in their health programs on the basis of race, color, national origin, sex, age, or disability. Though President Obama signed the ACA into law in 2010, the final rules implementing Section 1557 were not issued by the Centers for Medicare & Medicaid Services (CMS) until 2016. That final rule was subject to various federal injunctions, which limited implementation and enforcement. On June 19, 2020, the Trump Administration issued a revised final rule that significantly curtailed the requirements of the 2016 final rule, citing the federal injunctions and implementation costs.

On May 6, 2024, the Biden Administration published another revised Section 1557 final rule (Final Rule) that reinstates numerous provisions from the 2016 rule and adds new regulatory requirements. Providers may recall the 2016 rule's detailed notices of non-discrimination, which had to be posted in prominent locations in healthcare facilities, and its "tagline" requirements notifying individuals with limited English proficiency of the availability of free translation services. These requirements return in the Final Rule, alongside several key changes.[1] One major update in the 2024 Final Rule is that CMS extends the Section 1557 anti-discrimination provisions and requirements to entities that participate exclusively in Medicare Part B, i.e., private physician offices. Previously, the regulations did not apply to physician offices that participated only in the Medicare Part B program.

One of the most notable updates in the 2024 Final Rule addresses the use of artificial intelligence and clinical algorithms in healthcare through "Patient Care Decision Support Tools" (PCDSTs). A PCDST is defined as "any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a Covered Entity to support clinical decision-making in its health programs or activities."[2] These tools analyze an individual patient's health data and may be used by a provider to inform treatment decisions. PCDSTs are often delivered through digital platforms such as healthcare apps, telehealth platforms, and patient portals, and can be built into a provider's electronic health record system. Under the Final Rule, health programs and activities must prevent discrimination not only in their physical practice but also when utilizing PCDSTs.

AI Data Literacy is Essential for Providers

Artificial intelligence (AI) systems are "trained" on massive amounts of data. If that training data is not properly vetted and understood, biases embedded in the training inputs and datasets can be absorbed by the AI program, increasing the risk of discriminatory behavior and treatment decisions.

Under the Final Rule,[3] a Covered Entity now has an ongoing duty to make reasonable efforts to identify PCDSTs used in its health programs or activities that employ factors measuring characteristics protected under Section 1557. This can be a challenging task – AI developers typically do not share this type of proprietary data with their Covered Entity customers. Nonetheless, if a Covered Entity does not know what training data a developer's tool uses, but has reason to believe or should know that the tool could result in discrimination prohibited by Section 1557, it should consult available sources or request information from the developer. CMS will assess a Covered Entity's compliance with this vigilance standard using several considerations:

  • the Covered Entity’s size and resources;
  • the conditions and manner in which the PCDST is intended to be used;
  • whether the Covered Entity received a product warning from the developer about potential discrimination; and
  • whether the Covered Entity has a methodology in place for evaluating its PCDSTs.

If a Covered Entity identifies a PCDST that may lead to discriminatory treatment decisions in violation of Section 1557, the Final Rule requires the entity to make reasonable efforts to mitigate the risk of discrimination resulting from the tool’s use. Covered Entities need to conduct regular audits and risk assessments of their clinical AI tools, and designated personnel must be adequately trained and kept current on AI compliance when utilizing such tools. To this end, CMS endorses voluntary compliance programs and multidisciplinary teams that assess a Covered Entity’s use of AI tools and clinical algorithms and ensure compliance with the Final Rule. Such programs would establish policies and procedures governing algorithm use, processes for addressing complaints, and an approach to training staff in the proper use of these tools for decision-making.

The Final Rule represents a significant update in addressing discrimination in healthcare. Covered Entities are now required to be vigilant in their use of PCDSTs and clinical algorithms that may inform treatment decisions. Such vigilance will require a thorough inventory of the AI systems in a Covered Entity’s electronic environment, a dedicated effort to understand how providers and staff use those systems, and insight into how the tools were developed and trained. Covered Entities will need to audit such programs regularly and dedicate a portion of their compliance program to monitoring the use of PCDSTs.


[1] The Final Rule also expands the classification of sex discrimination under Section 1557 to include discrimination based on sex characteristics, pregnancy or related conditions, sexual orientation, gender identity, and sex stereotypes. However, a nationwide injunction already prevents CMS from enforcing the Final Rule to the extent it applies this expanded definition of sex discrimination.

[2] 45 CFR § 92.4

[3] 45 CFR § 92.210


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations.

© Bricker Graydon LLP | Attorney Advertising
