Researchers at the Sylvester Comprehensive Cancer Center and the Desai Sethi Urology Institute of the University of Miami, Florida, have found that generative adversarial networks (GANs) show promise for producing high-quality synthetic images derived from prostate MRI scans.
According to Himanshu Arora, PhD, an assistant professor at both Sylvester and the Desai Sethi Urology Institute, the technology developed in this study is a preliminary step toward building advanced data augmentation models in which newly created digital images can be used for further analysis. Although the work is still at an early stage, the results appear highly promising.
The research team used T2-weighted prostate MRI images from the BLaStM trial (NCT02307058) and other sources to train Single Natural Image GAN (SinGAN) models. The resulting SinGAN generative model was then integrated with a deep learning semantic segmentation pipeline to segment prostate biopsies on 2D MRI and histology slices. Scientists with different levels of experience (more than 10 years, 1 year, and none) took part in a quality control assessment of the images.
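For readers unfamiliar with the approach, the snippet below sketches the basic adversarial training idea behind a single-image GAN in PyTorch. It is an illustrative toy only, not the study's SinGAN pipeline: the network shapes, the random stand-in "slice," and all hyperparameters are assumptions made for demonstration.

```python
# Toy single-image adversarial training loop (illustration only; not the
# study's SinGAN code). A generator learns to map noise to an image that a
# patch-level discriminator cannot tell apart from the single "real" slice.
import torch
import torch.nn as nn

def conv_net(out_channels):
    # Small fully convolutional network used for both generator and critic.
    return nn.Sequential(
        nn.Conv2d(1, 32, 3, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, 32, 3, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(32, out_channels, 3, padding=1),
    )

G = nn.Sequential(conv_net(1), nn.Tanh())   # noise map -> synthetic slice in [-1, 1]
D = conv_net(1)                             # image -> per-patch realism logits

real = torch.rand(1, 1, 128, 128) * 2 - 1   # stand-in for one normalized T2-weighted slice
opt_g = torch.optim.Adam(G.parameters(), lr=5e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=5e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(200):
    noise = torch.randn_like(real)

    # Discriminator step: push real patches toward 1, generated patches toward 0.
    fake = G(noise).detach()
    real_logits, fake_logits = D(real), D(fake)
    d_loss = (bce(real_logits, torch.ones_like(real_logits)) +
              bce(fake_logits, torch.zeros_like(fake_logits)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: make generated patches look real to the discriminator.
    fake_logits = D(G(noise))
    g_loss = bce(fake_logits, torch.ones_like(fake_logits))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Draw a new synthetic slice from fresh noise.
synthetic = G(torch.randn_like(real))
```

In SinGAN itself, this kind of loop is repeated across a pyramid of image scales, which is how a single training image can yield many plausible synthetic variants.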
On average, the group of scientists with over 10 years of experience correctly identified conventional versus synthetic imaging 67% of the time. The group with one year of experience achieved 58% accuracy, while the group with no experience managed 50% accuracy.
The research team also compared outcomes when synthetic images were handpicked by the study team versus when images were randomly chosen after passing through the segmentation pipeline. The results revealed no significant difference in participants' quality control performance based on correct scores (P = .725) between their first and second tests. However, the number of false negatives increased between the two tests, with 47% of the synthetic images in the assessment being mistakenly identified as conventional images.
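For context on these metrics, the short sketch below shows how accuracy and the false-negative rate in a reader study like this could be tallied. The labels and reviewer answers are invented for illustration, with "synthetic" treated as the positive class; this is not the study's analysis code.

```python
# Illustrative only: tallying accuracy and false negatives for a blinded
# synthetic-vs-conventional identification task (made-up data).
truth   = ["synthetic", "conventional", "synthetic", "synthetic", "conventional"]
answers = ["synthetic", "conventional", "conventional", "synthetic", "synthetic"]

correct = sum(t == a for t, a in zip(truth, answers))
accuracy = correct / len(truth)

# A false negative here is a synthetic image the reviewer called conventional.
false_negatives = sum(t == "synthetic" and a == "conventional"
                      for t, a in zip(truth, answers))
fn_rate = false_negatives / sum(t == "synthetic" for t in truth)

print(f"accuracy = {accuracy:.0%}, false-negative rate = {fn_rate:.0%}")
```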
A board-certified radiologist conducted a blind assessment, grading images on a scale of 1 to 10 based on readability and the ability to generate a report from the image quality alone. Higher scores indicated better quality. The radiologist gave synthetic images a mean grade of 6.2 and conventional images a mean grade of 5.5 (P = .4839).
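The article does not state which statistical test produced this P value. Assuming a two-sample t-test, a comparison of two sets of 1-to-10 grades could be run as in the sketch below; the grade lists here are invented purely for illustration.

```python
# Hypothetical comparison of radiologist grades for synthetic vs conventional
# images, assuming a two-sample t-test (the study's actual test is not stated here).
from scipy import stats

synthetic_grades    = [7, 6, 5, 7, 6, 6]   # made-up grades for synthetic images
conventional_grades = [5, 6, 5, 6, 5, 6]   # made-up grades for conventional images

t_stat, p_value = stats.ttest_ind(synthetic_grades, conventional_grades)
print(f"mean synthetic = {sum(synthetic_grades)/len(synthetic_grades):.1f}, "
      f"mean conventional = {sum(conventional_grades)/len(conventional_grades):.1f}, "
      f"P = {p_value:.4f}")
```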
The study's authors propose that machine learning models hold potential as a supplement to medical decision-making, rather than as a replacement. Dr. Arora explained that timely diagnosis and prognosis assessment remain challenges in prostate cancer, leading to numerous deaths and increased risk of disease progression. While the human eye cannot be replaced in medical decision-making, advancements in technology may potentially aid radiation oncologists in making well-timed decisions.

References

1. Xu IRL, Booven DJV, Goberdhan S, et al. Generative adversarial networks can create high quality artificial prostate cancer magnetic resonance images. J Pers Med. 2023;13(3):547. doi:10.3390/jpm13030547
2. Scientists pioneer research to harness power of machine learning in prostate cancer imaging. News release. University of Miami Health System, Miller School of Medicine. March 23, 2023. Accessed March 29, 2023. https://www.newswise.com/articles/scientists-pioneer-research-to-harness-power-of-machine-learning-in-prostate-cancer-imaging?sc=sphr&xy=10016681