Automated Patient-Friendly Report Summary

Purpose

Creating an AI algorithm to inform the development of a patient-friendly radiology report.

Tag(s)

Non-interpretative

Panel

Reading Room Facing

Define-AI ID

21200005

Originator

Ali Tejani

Lead

Ali Tejani

Panel Chair

Ben Wandtke 

Panel Reviewers

Melissa Davis, Olga Brook, Ani Mannari, Alex Towbin, & Ben Wandtke

License

Creative Commons 4.0
Status

Public Comment


Clinical Implementation


Value Proposition

 

Converting radiology reports to lay language is of utmost importance now that patients have access to their medical records, including radiology reports, immediately after the reports are finalized by the radiologist. Historically, radiology reports have consisted of medical jargon intended for referring providers. However, restructuring reports to minimize technical verbiage and improve patient comprehension is increasingly necessary, as patients now often access their reports before the ordering provider has had an opportunity to review them and discuss the findings with the patient. Plain-language reporting would alleviate the anxiety patients feel when confronted with the unfamiliar terminology commonly found in radiology reports and foster patients’ ability to take a more active role in their own care. 

Patient-friendly reports would provide an opportunity to strengthen interactions between radiologists and patients, increasing the value that radiologists bring to each patient’s care. Accordingly, an artificial intelligence (AI) solution that translates traditional reports into plain language can create an empowering, patient-centric experience that allows for increased patient engagement and, potentially, improved care outcomes. The purpose of this use case is to outline an algorithm that analyzes radiology reports for medical jargon, translates terms into plain language, and presents the translated final report to patients through a simple, interactive user interface.
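As a concrete illustration of the jargon-translation step, the sketch below performs simple dictionary-based term substitution in Python. The glossary entries and function name are hypothetical; a production system would draw on a curated medical lexicon and trained language models rather than a hard-coded mapping:

```python
import re

# Hypothetical mini-glossary; a real system would use a curated medical
# lexicon and NLP models, not a hard-coded dictionary.
GLOSSARY = {
    "cardiomegaly": "enlarged heart",
    "pulmonary nodule": "small spot on the lung",
    "consolidation": "area of the lung filled with fluid instead of air",
}

def translate_report(text: str) -> str:
    """Replace known jargon terms with plain-language equivalents."""
    for term, plain in GLOSSARY.items():
        text = re.sub(rf"\b{re.escape(term)}\b", plain, text,
                      flags=re.IGNORECASE)
    return text
```

Note that term-level substitution alone yields stilted phrasing for negated findings (“no enlarged heart”); as in the narrative below, phrase-level rules are needed to render “no cardiomegaly” as “normal heart size.”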

Narrative(s)

 

A 40-year-old patient presents to the emergency department (ED) with fever and shortness of breath, prompting a chest x-ray (CXR) followed by a computed tomography (CT) scan of her chest for an incidental, suspicious pulmonary nodule. The ED physician informs the patient that she has acute pneumonia and sends her home with a prescription for antibiotics without mentioning the pulmonary nodule. The patient receives a notification from her online electronic medical record (EMR) portal the following day indicating that she may review her recent imaging reports from the ED visit. While reviewing the CXR and CT reports, she notices that the radiologist mentioned a “well-circumscribed mass in the left upper lobe with smooth margins and calcification” on the CXR, described in the CT report as a “solitary solid nodule measuring 7-8 mm.” As the patient performs an online search for these findings, she becomes anxious: she has a family history of breast cancer and now worries about a possible underlying cancer that has metastasized to her lungs. Unable to schedule a walk-in appointment with her primary care provider, she urgently presents to her nearest ED to discuss these findings with a provider as soon as possible.

Alternatively, the same scenario takes place at a center where the consulting radiology practice has implemented an automated patient-friendly report algorithm. The patient views her radiology reports the day after her ED visit, but she can hover over confusing terms and phrases with her cursor to view translated text in plain language with simplified anatomic diagrams. In another implementation, the patient receives, along with the original report, an additional companion document that translates jargon into plain language. For example, the field labeled “cardiac” on her companion report reads “normal heart size” rather than “no cardiomegaly” as on the original report. Based on the translated report, the patient learns that an incidental pulmonary nodule is a common finding that does not necessarily imply malignant disease. She also learns about complex terms, such as “air bronchograms” and “consolidation,” used to describe her pneumonia. Furthermore, she sees that the radiologist has linked recommendations for follow-up imaging based on population-based studies and validated guidelines. She uses the EMR portal to schedule the follow-up CT scan. 

Workflow Description

 

The radiology report is dictated by the radiologist using voice-to-text dictation software. Normally, the finalized report is sent to an interface engine, which routes the report to the radiology information system (RIS) or EMR and to the picture archiving and communication system (PACS). In this new workflow, the original report can be translated either within the voice-to-text dictation software or by a secondary AI orchestrator. If the algorithm runs within the voice-to-text dictation software, both the original and translated reports can be sent via the same process. If a secondary AI orchestrator is used, the original report is sent to the orchestrator via the interface engine, the algorithm translates the report, and an interface then sends both the original and translated reports to the downstream systems (RIS, EMR, PACS). 
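The secondary-orchestrator path can be sketched as follows. The function names, the placeholder translation rule, and the in-memory “routing” are all illustrative assumptions; a real deployment would exchange HL7 or FHIR messages through the interface engine:

```python
# Sketch of the orchestrator workflow: the interface engine hands the
# finalized report to the AI orchestrator, which returns a plain-language
# companion; both versions are then delivered downstream.
# All names here are illustrative, not a real interface-engine API.

def translate_to_plain_language(report_text: str) -> str:
    # Placeholder for the AI translation model.
    return report_text.replace("no cardiomegaly", "normal heart size")

def route_report(report_text: str, downstream: list[str]) -> dict:
    """Deliver the original and translated report to each downstream system."""
    translated = translate_to_plain_language(report_text)
    return {system: {"original": report_text, "translated": translated}
            for system in downstream}
```

In this sketch the orchestrator returns both versions so downstream systems (RIS, EMR, PACS) always receive the original report alongside its companion, mirroring the dual delivery described above.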


Considerations for Dataset Development


Procedure

All Procedures

Views

No images required

Age

All ages

Sex

Male, Female

Technical Specifications


Input

 

Radiology Report Text

 

Procedure

All

Views

N/A

Data Type

Rich text (HL7, FHIR response, database query)

Modality

All

Body Region

All





Primary Outputs


Translated report

Definition

Plain language companion report derived from original radiology report.

Data Type

Rich text (HL7, FHIR response, database query)

Value Set

N/A

Units

N/A



Alternative Output


Interactive translated report

Definition

Original report with hyperlinked text that opens a “pop-up window” containing a detailed description of technical terms in plain language, with optional anatomic diagrams, epidemiologic data/outcomes, and guidelines from governing bodies, as applicable.

Data Type

Rich text (HL7, FHIR response, database query)

Value Set

N/A

Units

N/A




Future Development Ideas


This algorithm generates plain-language companion reports from original radiology reports containing medical jargon. A further extension of this algorithm could generate interactive reports that allow the patient to hover over terms to open “pop-up” windows containing more detail, such as anatomic diagrams, recommendations, and pertinent demographic/epidemiologic information associated with the terms or phrases. This extension may require creation of an additional user interface that interacts with the PACS and EMR through the interface engine.
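A minimal sketch of the interactive extension, assuming the report is rendered as HTML: known terms are wrapped in `<abbr>` tags so the browser shows a plain-language tooltip on hover. The glossary entry and function are hypothetical, and pop-up content in practice would come from validated patient-education materials and guidelines:

```python
import html

# Hypothetical pop-up text; real content would come from validated
# patient-education materials and published guidelines.
POPUPS = {
    "air bronchograms": ("Air-filled airways that stand out against "
                         "surrounding fluid, a sign of pneumonia."),
}

def annotate_report_html(report_text: str) -> str:
    """Wrap known terms in <abbr> tags to show a plain-language tooltip."""
    for term, explanation in POPUPS.items():
        report_text = report_text.replace(
            term, f'<abbr title="{html.escape(explanation)}">{term}</abbr>')
    return report_text
```

Richer pop-ups (diagrams, guideline links) would replace the `title` attribute with a scripted overlay, but the term-anchoring approach is the same.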

References


  1. Alarifi M, Patrick T, Jabour A, Wu M, Luo J. Designing a consumer-friendly radiology report using a patient-centered approach. J Digit Imaging. 2021;34(3):705-716. doi:10.1007/s10278-021-00448-z

  2. Johnson AJ, Easterling D, Williams L, Glover S, Frankel RM. Insight from patients for radiologists: improving our reporting systems. J Am Coll Radiol. 2009;6(11):786-794. doi:10.1016/j.jacr.2009.07.010

  3. Oh SC, Cook TS, Kahn CE Jr. PORTER: a prototype system for patient-oriented radiology reporting. J Digit Imaging. 2016;29(4):450-454. doi:10.1007/s10278-016-9864-2