Solutions from RSNA Exhibitors Help Physicians Focus on Patients

Sunday, Dec. 01, 2019

Internal Data, Internal Solutions: Using Clinical Analytics for AI Model Validation

Along with increased optimism for the potential benefits of artificial intelligence (AI), one of the key takeaways from last year's RSNA meeting was the need for solutions that address the practical and technical challenges to clinical adoption of the technology without adding administrative burdens. Radiologists are looking for those solutions this year.

AI model validation is the next key challenge now that commercial diagnostic and workflow optimization algorithms available on some AI marketplaces are integrated with PACS and radiology reporting solutions. Instead of seeking new validation-specific tools, radiologists and researchers are turning to a clinical analytics solution already used for data mining and for radiology performance and quality improvement.

Three factors drive this unlikely combination of clinical analytics and algorithm validation. First is the need to test algorithms against internal patient data while keeping that data safely on site. Second is the impracticality of asking radiologists and radiology groups already struggling with burnout to sift through huge volumes of exams to extract that data. Third is the nature of machine learning algorithms themselves.

"AI models can be brittle," said Woojin Kim, MD, chief medical information officer for Nuance Communications. "If you take a model that's trained at one hospital, then take that model and deploy it at another institution with different scanners, protocols, or patient demographics, that model might not perform as expected."

Machine learning algorithms are also subject to concept drift and data drift, in which performance declines over time as incoming data, or the relationships the model learned during training, shift in unexpected ways. "That requires continuing surveillance of AI model performance using internal data," Dr. Kim said.
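In practice, such surveillance can be as simple as tracking how often a model's output agrees with the signed radiology report over a rolling window of recent exams. The sketch below only illustrates that idea and is not part of any Nuance product; the window size, alert threshold, and simulated exam stream are assumptions made for the example.

```python
import random
from collections import deque

def rolling_agreement(results, window=200, alert_below=0.90):
    """Yield (exam_index, rolling_rate, alert) for a stream of booleans,
    where each boolean records whether the AI model's finding matched
    the radiologist's signed report for that exam."""
    recent = deque(maxlen=window)
    for i, matched in enumerate(results, start=1):
        recent.append(matched)
        rate = sum(recent) / len(recent)
        # Alert only once the window is full, to avoid noisy startup values.
        yield i, rate, len(recent) == window and rate < alert_below

# Simulated exam stream: agreement quietly degrades after exam 500,
# e.g., following a scanner or protocol change.
random.seed(0)
stream = [random.random() < (0.96 if i < 500 else 0.85) for i in range(1000)]
for i, rate, alert in rolling_agreement(stream):
    if alert:
        print(f"Possible drift at exam {i}: rolling agreement {rate:.2f}")
        break
```

A sustained drop like this is exactly the kind of signal Dr. Kim describes, and it is only visible when the site keeps scoring the model against its own data.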

Model validation should not be a barrier to adopting AI to automate repetitive tasks and optimize workflows, said Jonathan Messinger, MD, chief of radiology at South Miami Hospital, FL. "Additional responsibilities placed on physicians certainly are a factor in burnout. We need solutions that give us more time to collaborate with our physician colleagues and be in front of patients," said Dr. Messinger, who is also a clinical professor of radiology at the Herbert Wertheim College of Medicine at Florida International University in Miami and medical director of imaging informatics at Radiology Associates of South Florida and the Baptist Health System.

One validation solution is Nuance's mPower Clinical Analytics platform. It uses advanced text mining and natural language processing to automatically extract the needed data from radiology reports in far less time than manual review. That makes it practical to create validation and surveillance datasets from the hospital's own data while also informing continuous improvement programs. Data stays safely on site, or it can be anonymized and shared with developers through the Nuance AI Marketplace to improve algorithm performance.
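As a rough sketch of how report mining builds such a dataset, the snippet below selects reports whose impressions mention a nodule. It is hypothetical, not Nuance's mPower API; the report fields and the finding pattern are assumptions for illustration.

```python
import re

# Hypothetical illustration only; not Nuance's mPower API. The report
# fields and the finding pattern are assumptions for the example.
NODULE = re.compile(r"\bnodules?\b", re.IGNORECASE)

def build_cohort(reports):
    """Collect reports whose impression mentions a nodule, keeping the
    accession number for local ground-truth lookup. A real pipeline
    would also de-identify the text before sharing it off site."""
    return [
        {"accession": r["accession"], "impression": r["impression"]}
        for r in reports
        if NODULE.search(r["impression"])
    ]

reports = [
    {"accession": "A1001", "impression": "6 mm pulmonary nodule, right upper lobe."},
    {"accession": "A1002", "impression": "No acute cardiopulmonary abnormality."},
]
print(build_cohort(reports))  # only A1001 qualifies
```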

"Radiologists can feel confident testing algorithms from outside developers against their own data," said William Boonn, MD, chief medical information officer for Nuance. "It also enables them to compare different algorithms for the same diagnosis to find the best match for their facility, or to select models that can optimize performance in specific clinical areas."

"Using clinical analytics for performance improvements while driving adoption of AI can have a tremendously positive impact on clinical and financial outcomes and physician burnout," said Dr. Boonn. "That's a real win-win for radiology."