OCR, CMS Issue New ACA Section 1557 Final Rule Prohibiting Discrimination Related to Use of Artificial Intelligence in Health Care

Mintz - Health Care Viewpoints

Preventing discrimination and bias in connection with the use of artificial intelligence (AI) in health care is a principal current focus of the U.S. Department of Health and Human Services (HHS) and was included among the health care directives in the recent Biden Administration Executive Order on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (Executive Order). Consistent with these priorities, on April 26, 2024, the HHS Office for Civil Rights (OCR) and the Centers for Medicare & Medicaid Services (CMS) published a new final rule under Section 1557 of the Affordable Care Act (ACA) that aims to broadly address inequity across health care and also requires entities covered under Section 1557 to take certain actions regarding their use of AI in clinical decision-making (Final Rule).

Final Rule Background

OCR originally became aware of potential bias related to states’ use of AI in their respective Crisis Standards of Care during the COVID-19 Public Health Emergency, which led to OCR’s inclusion of AI and machine learning tools in the proposed Section 1557 rule published on August 4, 2022 (Proposed Rule) and, ultimately, in the Final Rule.

Scope of Patient Care Decision Support Tools

The Final Rule, at 45 C.F.R. § 92.210, prohibits recipients of Federal financial assistance, HHS, and entities established under Title I of the ACA, including State Exchanges and Federally-facilitated Exchanges (collectively, Covered Entities), from discriminating on the basis of race, color, national origin, sex, age, or disability in health programs or activities through the use of “patient care decision support tools.” A patient care decision support tool, a newly defined term that replaces the term “clinical algorithm” used in the Proposed Rule, is “any automated or non-automated tool, mechanism, method, technology, or combination thereof used by a Covered Entity to support clinical decision-making in its health programs or activities.”

Patient care decision support tools can be used at the individual patient level or at the population health level and include tools used for prior authorization and medical necessity analysis. As discussed in the Proposed Rule, these tools may be used for activities such as screening, risk prediction, diagnosis, prognosis, clinical decision-making, treatment planning, health care operations, and allocation of resources. OCR cited recent studies showing that race and ethnicity are often used as explicit input variables, with the tool then adjusting an algorithm’s output based on a patient’s race or ethnicity. These practices, according to OCR, may create or contribute to discrimination on the bases protected by Section 1557, and their use by Covered Entities in clinical decision-making may lead to poorer health outcomes.

Non-automated and evidence-based tools are also considered patient care decision support tools. For example, OCR cited the use of pulse oximeters and race-adjusted estimated glomerular filtration rate (eGFR) equations as potentially leading to racial bias. The Final Rule does not, however, apply to the following activities when they are unrelated to clinical decision-making affecting patient care: automated or non-automated tools that Covered Entities use for administrative and billing-related activities; automated medical coding; fraud, waste, and abuse detection; patient scheduling; facilities management; inventory and materials management; supply chain management; financial market investment management; or employment and staffing-related activities.
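To make concrete how race can enter such a tool as an explicit input variable, the sketch below implements the 2009 CKD-EPI creatinine equation, one of the race-adjusted eGFR equations discussed in the preamble. The coefficients are from the published equation, but the code itself is our illustration, not part of the Final Rule and not clinical software.

```python
# Sketch: the 2009 CKD-EPI creatinine equation, in which race is an
# explicit input variable. Coefficients follow the published equation;
# illustration only, not clinical software.

def egfr_ckd_epi_2009(scr_mg_dl: float, age: int, female: bool, black: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) per the 2009 CKD-EPI equation."""
    kappa = 0.7 if female else 0.9
    alpha = -0.329 if female else -0.411
    ratio = scr_mg_dl / kappa
    egfr = 141 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.209 * 0.993 ** age
    if female:
        egfr *= 1.018
    if black:  # the race adjustment: identical labs, different output by race
        egfr *= 1.159
    return egfr

# Identical labs and demographics except for recorded race yield different
# estimates, which can shift CKD staging or referral decisions.
print(egfr_ckd_epi_2009(1.2, 50, female=False, black=False))  # ~70
print(egfr_ckd_epi_2009(1.2, 50, female=False, black=True))   # ~81
```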

Covered Entity Requirements

In addition to the general non-discrimination prohibition, a Covered Entity must “make reasonable efforts to identify” uses of patient care decision support tools in its health programs or activities that employ input variables or factors that measure race, color, national origin, sex, age, or disability. When determining whether a Covered Entity has met the Final Rule’s “reasonable efforts to identify” requirement, OCR may take the following factors into consideration (a simple illustrative screen of this kind is sketched after the list):

  • the Covered Entity’s size and resources (e.g., a large hospital with an IT department and a health equity officer would likely be expected to make greater efforts to identify tools than a smaller provider without such resources);
  • whether the Covered Entity used the tool in the manner or under the conditions intended by the developer and approved by regulators, if applicable, or whether the Covered Entity has adapted or customized the tool;
  • whether the Covered Entity received product information from the developer of the tool regarding the potential for discrimination or identified that the tool’s input variables include race, color, national origin, sex, age, or disability; and
  • whether the Covered Entity has a methodology or process in place for evaluating the patient care decision support tools it adopts or uses, which may include seeking information from the developer, reviewing relevant medical journals and literature, obtaining information from membership in relevant medical associations, or analyzing comments or complaints received about patient care decision support tools.
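By way of illustration only, a minimal version of the kind of inventory screen such an evaluation methodology might include is sketched below. The tool names, input fields, and attribute list are our assumptions for the example; the Final Rule does not prescribe any particular implementation.

```python
# Hypothetical sketch of an inventory screen a Covered Entity's evaluation
# process might include: flag tools whose documented input variables
# include attributes protected under Section 1557. All tool names, fields,
# and the attribute spelling are illustrative assumptions, not rule text.

PROTECTED_ATTRIBUTES = {"race", "color", "national_origin", "sex", "age", "disability"}

tool_inventory = [
    {"name": "sepsis_risk_model", "inputs": {"heart_rate", "lactate", "age"}},
    {"name": "egfr_calculator_2009", "inputs": {"creatinine", "age", "sex", "race"}},
    {"name": "bed_assignment_optimizer", "inputs": {"unit_census", "acuity"}},
]

def flag_tools_for_review(inventory):
    """Return (tool name, protected inputs) pairs needing mitigation review."""
    flagged = []
    for tool in inventory:
        overlap = tool["inputs"] & PROTECTED_ATTRIBUTES
        if overlap:
            flagged.append((tool["name"], sorted(overlap)))
    return flagged

for name, attrs in flag_tools_for_review(tool_inventory):
    print(f"{name}: review use of {', '.join(attrs)}")
```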

Next, to the extent that it identifies such tools, a Covered Entity must make reasonable efforts to mitigate the risk of discrimination resulting from the tool’s use in its health programs or activities. OCR did not prescribe specific risk mitigation measures, but it offered an example: once a Covered Entity determines that it is using a race-adjusted eGFR equation, it could discontinue that equation and instead use the revised eGFR equation that does not adjust for race, or it could implement protocols governing its use of the race-adjusted equation.
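Continuing the earlier eGFR sketch, the revised equation OCR references is the 2021 CKD-EPI creatinine equation, which removes the race multiplier entirely. Again, the coefficients come from the published equation, and the code is a sketch for illustration.

```python
# Sketch of the mitigation OCR describes: the revised 2021 CKD-EPI
# creatinine equation, which drops race as an input. Coefficients follow
# the published equation; illustration only.

def egfr_ckd_epi_2021(scr_mg_dl: float, age: int, female: bool) -> float:
    """Estimated GFR (mL/min/1.73 m^2) per the 2021 CKD-EPI equation (race-free)."""
    kappa = 0.7 if female else 0.9
    alpha = -0.241 if female else -0.302
    ratio = scr_mg_dl / kappa
    egfr = 142 * min(ratio, 1.0) ** alpha * max(ratio, 1.0) ** -1.200 * 0.9938 ** age
    if female:
        egfr *= 1.012
    return egfr

# The same inputs now yield one estimate regardless of recorded race.
print(egfr_ckd_epi_2021(1.2, 50, female=False))
```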

Final Rule Connection to HTI-1 Rule

The Office of the National Coordinator for Health Information Technology’s (ONC) Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing (HTI-1) Rule was published in December 2023 and served as an important first step toward regulating the use of AI in health care. The HTI-1 Rule applies only to developers of certified health IT, while the Final Rule applies to Covered Entities that use patient care decision support tools, which may include “predictive decision support interventions” regulated under the HTI-1 Rule. The two rules are also connected because the HTI-1 Rule requires these developers to disclose certain information relevant to health equity to users of their products. Obtaining this information from a developer would enable a Covered Entity to learn whether a specific patient care decision support tool (i) is included within the developer’s certified Health IT Module; and (ii) relies on attributes that measure race, color, national origin, sex, age, or disability. With this information, Covered Entities can then undertake any risk mitigation efforts required by the Final Rule.

Effective Date of Patient Care Decision Support Tool Requirements

The Final Rule will become effective 60 days after its publication in the Federal Register; it is scheduled to be published on May 6, 2024 and would therefore become effective, as currently scheduled, on July 5, 2024. However, to allow Covered Entities enough time to come into compliance, OCR is finalizing the 45 C.F.R. § 92.210 requirements around patient care decision support tools with a delayed applicability date of no later than 300 days after the effective date of the Final Rule (as currently scheduled, May 1, 2025).

Conclusion

In issuing the Final Rule, OCR is putting Covered Entities on notice that they must exercise due diligence when acquiring and using patient care decision support tools. The Final Rule puts the onus on Covered Entities to (i) determine whether and to what extent they are using patient care decision support tools, as defined in the Final Rule; and (ii) if they are using these tools, mitigate any known risks of discrimination or bias.

While OCR views these activities as necessary to reduce discrimination and bias arising from the use of AI in health care, it declined to be overly prescriptive; for example, it does not require Covered Entities to notify patients about their use of patient care decision support tools, citing the frequency with which the tools change and the cost of notification. As a best practice, however, Covered Entities will want to consider establishing written policies and procedures governing how patient care decision support tools are used in decision-making, including adopting governance measures, creating mechanisms to monitor potential impacts, developing ways to address complaints, and training staff on the proper use of these tools.


DISCLAIMER: Because of the generality of this update, the information provided herein may not be applicable in all situations and should not be acted upon without specific legal advice based on particular situations. Attorney Advertising.

© Mintz - Health Care Viewpoints
