Chest Radiographs Aid Deep Learning Model for Predicting Future Health Care Costs

Tuesday, Dec. 03, 2019

What if one scan could help predict a patient's future health care costs? That's the question Yixin Chen sought to answer through research as an undergraduate studying statistics and computer science at the University of California-Berkeley.

She presented the results of that study, "Prediction of Future Healthcare Expenses from Chest Radiographs Using Deep Learning," during a Monday session. Chen's research earned the RSNA Trainee Research Prize in the Medical Student category.

Although Chen, who is now pursuing a master's degree in biostatistics at the University of Michigan at Ann Arbor, is not studying to become a doctor, she worked closely with radiologist Jae Ho Sohn, MD, MS, at the UCSF Department of Radiology and Biomedical Imaging, who is a co-author on the study. Dr. Sohn received the Margulis Award at RSNA 2019 for his Radiology research, "A Deep Learning Model to Predict a Diagnosis of Alzheimer Disease Using 18F-FDG PET of the Brain."

Realizing that the top 50% of spenders account for about 97% of total U.S. health care expenditures, Chen hypothesized that big data and deep learning algorithms could help predict some of those costs.

"Cost is an important barrier to health care access. Having a reliable cost estimation and reliable prediction of top spenders can be a starting point for developing early interventions to improve health and help people plan accordingly," Chen said.

Chen chose to study the chest radiograph because it contains a wealth of information that radiologists do not routinely use, including general health indicators that could help predict future medical costs. She wanted to design an algorithm to test whether health care expenses in the five years after a chest radiograph is taken could be predicted from the image, and whether that prediction could identify the future top 50% of spenders.

Chest Radiographs Improved DL Model's Accuracy

Using 21,872 chest radiographs from 19,524 patients, each paired with that patient's total spending over the following five years, Chen built four machine learning (ML) models with different inputs. The baseline model used only patient demographics such as age, sex, zip code and median income to predict cost. Another model used only chest radiographs to predict costs, while the last two used all of the variables but differed in design.
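As a rough illustration only, a combined model of this kind might fuse image features from a convolutional network with the tabular demographic features before a regression head. The article does not describe the study's actual architecture, so the backbone, layer sizes and variable names below are assumptions, not details from Chen's work.

```python
# Hypothetical sketch of a model that combines chest-radiograph features with
# demographic inputs to regress five-year spending. Architecture details
# (ResNet-18 backbone, layer sizes) are illustrative assumptions only.
import torch
import torch.nn as nn
import torchvision.models as models


class CostFusionModel(nn.Module):
    def __init__(self, n_demographic_features: int = 4):
        super().__init__()
        # Image branch: ImageNet-pretrained ResNet-18 with its classifier removed.
        # Grayscale radiographs would need to be replicated to 3 channels first.
        backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
        backbone.fc = nn.Identity()  # yields a 512-dim image embedding
        self.image_branch = backbone
        # Tabular branch: age, sex, zip-code-derived income, etc.
        self.tabular_branch = nn.Sequential(
            nn.Linear(n_demographic_features, 32),
            nn.ReLU(),
        )
        # Regression head on the concatenated features.
        self.head = nn.Sequential(
            nn.Linear(512 + 32, 128),
            nn.ReLU(),
            nn.Linear(128, 1),  # predicted five-year expenditure
        )

    def forward(self, radiograph: torch.Tensor, demographics: torch.Tensor) -> torch.Tensor:
        img_feat = self.image_branch(radiograph)      # (batch, 512)
        tab_feat = self.tabular_branch(demographics)  # (batch, 32)
        return self.head(torch.cat([img_feat, tab_feat], dim=1)).squeeze(1)
```

Dropping either branch would recover the radiograph-only and demographics-only variants described above, which is one simple way such input comparisons are often set up.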

The best regression model in Chen's experiment was able to predict five-year expenditures within a 95% confidence interval. Each of the three models that included the chest radiograph as an input was more accurate than the baseline model that only included demographic data as a predictor for future costs.
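A minimal sketch of how such a comparison might be scored on held-out data follows; the metric choices (mean absolute error for the regression, AUC for flagging top-50% spenders) and the synthetic numbers are illustrative assumptions, not the study's reported methodology or results.

```python
# Hypothetical evaluation sketch: compare models by regression error and by
# how well they rank the top 50% of spenders. Data below are synthetic.
import numpy as np
from sklearn.metrics import mean_absolute_error, roc_auc_score


def evaluate(y_true_cost: np.ndarray, y_pred_cost: np.ndarray) -> dict:
    # Label patients at or above the median true 5-year spend as top-50% spenders
    # and use the predicted cost as the ranking score.
    top_spender = (y_true_cost >= np.median(y_true_cost)).astype(int)
    return {
        "mae": mean_absolute_error(y_true_cost, y_pred_cost),
        "top50_auc": roc_auc_score(top_spender, y_pred_cost),
    }


# Placeholder comparison of a demographics-only baseline against a model
# that also uses the chest radiograph (predictions are simulated here).
rng = np.random.default_rng(0)
y_true = rng.gamma(shape=2.0, scale=5000.0, size=1000)    # synthetic costs
baseline_pred = y_true + rng.normal(0, 6000, size=1000)   # noisier predictions
combined_pred = y_true + rng.normal(0, 3000, size=1000)   # closer to the truth
print("baseline:", evaluate(y_true, baseline_pred))
print("combined:", evaluate(y_true, combined_pred))
```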

"The models can be used to identify high-risk patients, enabling early intervention to reduce risk and cost," Chen said. "It can also be used as an indicator that raises alarm against a seemingly healthy chest radiograph by clinical radiological standards."

Although the study was conducted at one institution, Chen hopes it will be replicated at a larger scale in the future.

"The idea of predicting health care costs from one single chest X-ray might be too simple to be true, but when I think about how radiologists can extract so much information about a patient's health from a chest X-ray, and it makes a lot of sense to estimate how much money that patient is going to spend based on how sick they are," Chen said. "I'm excited to see where this kind of thinking about health care data takes us in the future."