Background: Radiologists play a critical role in communicating important clinical information identified through medical imaging, and the radiology report is the cornerstone of this process. While primarily directed to the referring clinician, information in the report may be sought by stakeholders beyond the immediate referrer, and increasingly, patients themselves may access and interpret their reports.1
The medical lexicon used in a typical report is often optimised for the referring doctor, but can pose a barrier to a patient less familiar with such jargon, limiting an opportunity to engage them directly in their care. Effective communication has numerous benefits and has been shown to improve overall patient outcomes: improving patient health literacy increases treatment adherence, aids informed decision-making, empowers patients and reduces litigation. Unfortunately, miscommunication within the clinician-patient relationship persists as a primary source of patient dissatisfaction, and is a challenging obstacle to overcome given the increasing complexity and breadth of medical care2–4.
An effective written report should consider a myriad of factors, including style, organisation and content. However, a key factor is its readability, defined as the ease with which a text can be understood. The American Medical Association (AMA) and the National Institutes of Health (NIH) recommend that patient materials be written at or below the 6th to 8th grade reading level5,6.
One method of quantifying readability is via readability scores. These provide a numerical measure of the complexity of a piece of text, taking into account factors such as sentence length, syllable counts and vocabulary difficulty. Reflecting on these scores can help communicators tailor their content to an appropriate reading and comprehension level6,7. One of the most common scoring systems is the Flesch-Kincaid Reading Ease (FKRE) score, based on the number of words, syllables and sentences within a passage. Using these details, readability is scored on a scale of 0 to 100, with higher scores corresponding to greater readability (see figure 1).
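As an illustration only (this is not the scoring tool used in the study), the published FKRE formula, 206.835 − 1.015 × (words/sentences) − 84.6 × (syllables/words), can be sketched in a few lines of Python. The syllable counter below is a crude vowel-group heuristic for demonstration; real readability calculators use more sophisticated syllabification.

```python
import re

def count_syllables(word):
    # Crude heuristic: count vowel groups, discounting a trailing silent "e".
    # Not a true syllabifier; adequate only for illustration.
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(1, n)

def fkre(text):
    # Flesch-Kincaid Reading Ease: higher scores = easier to read.
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short, monosyllabic sentences score far higher than long, polysyllabic ones, which is the property the formula exploits; note the raw formula can stray outside the nominal 0–100 range for extreme texts.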
With the recent explosion in the capability of natural language processing (NLP) models, automated text simplification can drastically reduce linguistic complexity and improve comprehension8,9. NLP models simplify text through a set of defined editing principles, including reducing redundancy, summarising content, paraphrasing complex sentences, replacing jargon and adjusting vocabulary. For example, “the ominous, looming clouds engulfed the hill” can be altered to the simpler “the gloomy clouds covered the hill”.
Despite the radiology report's importance as a keystone of the communication process, formal training dedicated to crafting an effective report is minimal, with one study showing that 86% of residency programs provided no more than one hour of didactic instruction in radiology reporting per year8.
We theorised that artificial intelligence (AI)-based NLP models may simplify radiology reports into a format more accessible to patients. An AI model could provide an efficient method of creating tailored radiology reports with minimal use of radiologist time; however, concerns have been raised over the accuracy of AI-generated text.
In this study, we explore the use of ChatGPT, a freely available language model built on the GPT-3.5 architecture, to rewrite radiology reports with an emphasis on improving clarity for readers across varying levels of health literacy.
Methods: Thirty emergency department CT reports from a tertiary hospital medical imaging department were randomly selected over a period of one week. ChatGPT was asked to revise each report conclusion to a level understandable by a 14-year-old while preserving the meaning of the original text.
The readability of the original and simplified conclusions was evaluated using the FKRE score. The mean and standard deviation of the FKRE score were calculated, and a paired t-test was used to compare the means of the two groups.
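A paired t-test compares the two sets of scores report-by-report, testing whether the mean within-pair difference differs from zero. A minimal sketch of the statistic (the study itself would likely have used a standard statistics package; the score values below are hypothetical):

```python
import math
import statistics

def paired_t(before, after):
    # Paired t-test statistic: t = mean(d) / (sd(d) / sqrt(n)),
    # where d are the per-report score differences.
    diffs = [a - b for a, b in zip(after, before)]
    n = len(diffs)
    mean_d = statistics.mean(diffs)
    sd_d = statistics.stdev(diffs)          # sample SD (n - 1 denominator)
    t = mean_d / (sd_d / math.sqrt(n))
    return t, n - 1                          # t statistic, degrees of freedom
```

The returned t statistic would then be referred to the t distribution with n − 1 degrees of freedom (e.g. 29 for thirty paired reports) to obtain a p value.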
Each ‘simplified’ report was reviewed by a consultant radiologist (AS) and a radiology registrar (NM) to determine whether the simplified reports were accurate and maintained appropriate context. A generalised thematic analysis was performed to identify common ways in which communication and readability were altered.