AI can Help Rule out Abnormal Pathology on Chest X-Rays

A commercial artificial intelligence (AI) tool used off-label was effective at excluding pathology and had equal or lower rates of critical misses on chest X-ray than radiologists, according to a study published today in Radiology, a journal of the Radiological Society of North America (RSNA).

Recent developments in AI have sparked growing interest in computer-assisted diagnosis, motivated in part by the increasing workload faced by radiology departments, the global shortage of radiologists, and the risk of burnout in the field. Radiology practices handle a high volume of unremarkable (no clinically significant findings) chest X-rays, and AI could improve workflow by providing an automatic report for these.

Researchers in Denmark set out to estimate the proportion of unremarkable chest X-rays where AI could correctly exclude pathology without increasing diagnostic errors. The study included radiology reports and data from 1,961 patients (median age, 72 years; 993 female), with one chest X-ray per patient, obtained from four Danish hospitals.

"Our group and others have previously shown that AI tools are capable of excluding pathology in chest X-rays with high confidence and thereby provide an autonomous normal report without a human in the loop," said lead author Louis Lind Plesner, M.D., from the Department of Radiology at Herlev and Gentofte Hospital in Copenhagen, Denmark. "Such AI algorithms miss very few abnormal chest radiographs. However, before our current study, we didn’t know what the appropriate threshold was for these models."

The research team wanted to know whether the quality of the mistakes made by AI and by radiologists differed, and whether AI mistakes were, on average, objectively worse than human mistakes.

The AI tool was adapted to generate a chest X-ray “remarkableness” probability, which was used to calculate specificity (a measure of a medical test’s ability to correctly identify people who do not have a disease) at different AI sensitivities.
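As a rough illustration of this kind of analysis (not the study's actual method, code, or data), choosing an operating threshold on a "remarkableness" probability so that sensitivity stays at or above a target, and then reading off the specificity achieved, could be sketched as follows; the function name and the example values are hypothetical:

```python
def specificity_at_sensitivity(probs, labels, target_sensitivity=0.98):
    """Hypothetical sketch: probs are model 'remarkable' probabilities,
    labels are 1 for remarkable and 0 for unremarkable X-rays."""
    # Candidate thresholds: every distinct predicted probability.
    thresholds = sorted(set(probs), reverse=True)
    best = None
    for t in thresholds:
        preds = [p >= t for p in probs]
        tp = sum(1 for p, y in zip(preds, labels) if p and y == 1)
        fn = sum(1 for p, y in zip(preds, labels) if not p and y == 1)
        tn = sum(1 for p, y in zip(preds, labels) if not p and y == 0)
        fp = sum(1 for p, y in zip(preds, labels) if p and y == 0)
        sensitivity = tp / (tp + fn)
        specificity = tn / (tn + fp)
        if sensitivity >= target_sensitivity:
            # Among thresholds meeting the sensitivity floor, keep the one
            # with the highest specificity (the most normals excluded).
            if best is None or specificity > best[1]:
                best = (t, specificity, sensitivity)
    return best  # (threshold, specificity, sensitivity) or None
```

The trade-off the study reports follows directly from this kind of sweep: lowering the sensitivity floor raises the achievable specificity, i.e. the share of unremarkable X-rays the AI can safely exclude.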

Two chest radiologists, blinded to the AI output, labeled the chest X-rays as "remarkable" or "unremarkable" based on predefined unremarkable findings. Chest X-rays with findings missed by the AI and/or the radiology report were graded by one chest radiologist, blinded to whether the mistake was made by the AI or a radiologist, as critical, clinically significant, or clinically insignificant.

The reference standard labeled 1,231 of 1,961 chest X-rays (62.8%) as remarkable and 730 of 1,961 (37.2%) as unremarkable. The AI tool correctly excluded pathology in 24.5% to 52.7% of unremarkable chest X-rays at greater than or equal to 98% sensitivity, with lower rates of critical misses than found in the radiology reports associated with the images.

Dr. Plesner notes that the mistakes made by AI were, on average, more clinically severe for the patient than mistakes made by radiologists.

"This is likely because radiologists interpret findings based on the clinical scenario, which AI does not," he said. "Therefore, when AI is intended to provide an automated normal report, it has to be more sensitive than the radiologist to avoid decreasing standard of care during implementation. This finding is also generally interesting in this era of AI capabilities covering multiple high-stakes environments not only limited to health care."

AI could autonomously report more than half of all normal chest X-rays, according to Dr. Plesner. "In our hospital-based study population, this meant that more than 20% of all chest X-rays could have been potentially autonomously reported using this methodology, while keeping a lower rate of clinically relevant errors than the current standard," he said.
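The roughly 20% figure follows from the numbers quoted earlier in the article; a back-of-envelope check (using only figures stated above):

```python
# Figures taken from the article: 730 of 1,961 X-rays were unremarkable,
# and the AI correctly excluded pathology in up to 52.7% of those.
unremarkable_fraction = 730 / 1961
best_exclusion_rate = 0.527

# Share of ALL chest X-rays that could be autonomously reported.
autonomously_reportable = unremarkable_fraction * best_exclusion_rate
print(round(unremarkable_fraction, 3))    # ≈ 0.372
print(round(autonomously_reportable, 3))  # ≈ 0.196, i.e. about 20%
```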

Dr. Plesner noted that a prospective implementation of the model using one of the thresholds suggested in the study is needed before widespread deployment can be recommended.

Plesner LL, Müller FC, Brejnebøl MW, Krag CH, Laustrup LC, Rasmussen F, Nielsen OW, Boesen M, Andersen MB.
Using AI to Identify Unremarkable Chest Radiographs for Automatic Reporting.
Radiology. 2024 Aug;312(2):e240272. doi: 10.1148/radiol.240272
