A Focus on Transparency in AI by the ONC
The US Office of the National Coordinator for Health Information Technology (ONC) recently released a final rule titled "Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing" that updates the Health IT Certification Program and implements provisions of the 21st Century Cures Act. This rule significantly impacts the transparency and interoperability of health information technology and electronic health records, with a strong focus on AI and predictive algorithms.
The final rule was published on January 9, 2024, took effect on February 8, 2024, and was corrected on March 17, 2024. The ONC stated that the rule's transparency requirements for AI are "especially aligned" with Executive Order 14110, "Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence," issued by President Biden on October 30, 2023, which aims to promote the responsible and ethical development and use of artificial intelligence in both the federal government and the private sector. The ONC also noted that the transparency requirements are consistent with Section 8 of the E.O., "Protecting Consumers, Patients, Passengers, and Students."
The final rule applies to entities involved with health information technology (HIT) and electronic health records (EHRs), including health IT developers, health information networks (HINs), health information exchanges (HIEs), and health care providers. It covers US-based entities as well as entities outside the US whose dealings with US health care providers or patients involve health IT systems subject to ONC's regulations. These entities must adhere to the rule's certification criteria and requirements.
New transparency requirements for artificial intelligence
The rule implements the Cures Act's "EHR Reporting Program" through new Condition and Maintenance of Certification requirements designed to enhance the interoperability, transparency, and usability of health IT systems and products, and to protect patient data and privacy. A key component of the rule is the establishment of new transparency requirements for artificial intelligence (AI) and predictive algorithms used within certified health IT systems. These requirements aim to support clinical decision-making by providing a baseline set of information about such algorithms, enabling users to assess them for fairness, suitability, validity, effectiveness, and safety. Certified health IT, which is used by more than 96% of hospitals and 78% of office-based physicians in the US, must now follow these standards, encouraging responsible AI use.
The rule also establishes the United States Core Data for Interoperability (USCDI) Version 3 as the new standard, starting from January 1, 2026, which includes updates that aim to improve patient characteristics data to advance equity, address disparities, and support public health data interoperability. In addition, it modifies information blocking definitions and exceptions to enable information sharing between and among HITs, EHRs, HINs, HIEs and health care providers (as applicable), and introduces new interoperability-focused reporting metrics for certified health IT, offering insights into how such technology aids care delivery.
In terms of algorithm transparency, the rule defines predictive decision support interventions (DSIs) as technologies that support decision-making through predictive outputs, encompassing techniques such as machine learning and natural language processing. Predictive DSIs are required to disclose an expanded set of "source attributes": information about their technical performance, development, and underlying quality. This includes detailed information about training data sets, external validation processes, and the measures taken to ensure fairness and eliminate bias. The predictive DSI certification criterion takes effect on January 1, 2025.
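To make the idea of a "source attributes" disclosure concrete, the sketch below models one as a small data structure that a certified health IT module could check for completeness before surfacing it to users. This is a minimal, purely illustrative sketch: the field names (`training_data_description`, `external_validation_process`, `fairness_measures`, and so on) are assumptions drawn from the categories described above, not the rule's actual enumerated attribute list, and the example values are invented.

```python
from dataclasses import dataclass

# Hypothetical sketch of a predictive DSI "source attributes" disclosure.
# Field names are illustrative assumptions, not the rule's actual attribute list.

@dataclass
class SourceAttributes:
    intervention_name: str
    developer: str
    intended_use: str
    training_data_description: str    # provenance and composition of training data
    external_validation_process: str  # how the model was validated outside development
    fairness_measures: str            # steps taken to assess and mitigate bias

    def missing_fields(self):
        """Return the names of any attributes left empty, so an incomplete
        disclosure can be flagged before it is displayed to clinical users."""
        return [name for name, value in vars(self).items() if not str(value).strip()]


disclosure = SourceAttributes(
    intervention_name="Sepsis risk score (example)",
    developer="Example Health IT Vendor",
    intended_use="Early warning for inpatient sepsis",
    training_data_description="De-identified EHR records, 2018-2022",
    external_validation_process="",  # intentionally left blank for the example
    fairness_measures="Subgroup performance audit by age, sex, and race",
)
print(disclosure.missing_fields())  # -> ['external_validation_process']
```

The completeness check mirrors the rule's intent that users receive a baseline of information about each predictive DSI; an actual certified module would validate against the rule's full attribute set rather than this invented subset.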
For HIT developers specifically, these changes mean providing comprehensive transparency about the algorithms they develop or incorporate into their products and services, including how those algorithms are created, trained, and validated. Developers must also manage intervention risks by analyzing potential risks and adverse impacts across factors such as validity, reliability, and fairness, and by implementing practices to mitigate those risks.
By mandating HIT developers and other stakeholders to adopt more transparent practices, this rule aims to enhance the quality, safety, and equity of healthcare delivery. It positions transparency at the forefront, encouraging the responsible use of AI and other predictive technologies in healthcare, potentially transforming how health information technology supports patient care and decision-making across the healthcare ecosystem.
The ONC's final rule will have a significant impact on the health IT industry, as it will require organizations to make substantial changes to their products and processes, for example:
- The final rule may impose significant costs and burdens, as organizations will have to invest time and resources to bring their products and processes up to the new standards and requirements.
- The rule may also create more legal and regulatory risks, as these organizations will have to comply with stricter rules and face more oversight and potential enforcement action.
- The rule may create new ethical and social challenges for these organizations, as they will have to balance the interests and obligations of various stakeholders, as well as analyze and mitigate the potential harms arising from the algorithms they develop/provide.
- The rule may raise new questions about the trustworthiness, fairness, and accountability of health IT algorithms, as well as their impact on human autonomy, dignity, and values. At the same time, it represents a significant step toward building trust and transparency in the health IT ecosystem by setting minimum requirements and standards.