Presentation + Paper
29 March 2024 Ambient-Pix2PixGAN for translating medical images from noisy data
Wentao Chen, Xichen Xu, Jie Luo, Weimin Zhou
Abstract
Image-to-image translation is a common task in computer vision and has a rapidly increasing impact on the field of medical imaging. Deep learning-based methods that employ conditional generative adversarial networks (cGANs), such as Pix2PixGAN, have been extensively explored for image-to-image translation tasks. However, when noisy medical image data are considered, such methods cannot be directly applied to produce clean images. Recently, an augmented GAN architecture named AmbientGAN has been proposed that can be trained on noisy measurement data to synthesize high-quality clean medical images. Inspired by AmbientGAN, in this work, we propose a new cGAN architecture, Ambient-Pix2PixGAN, for performing medical image-to-image translation tasks by use of noisy measurement data. Numerical studies that consider MRI-to-PET translation are conducted. Both traditional image quality metrics and task-based image quality metrics are employed to assess the proposed Ambient-Pix2PixGAN. It is demonstrated that our proposed Ambient-Pix2PixGAN can be successfully trained on noisy measurement data to produce high-quality translated images in the target imaging modality.
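The core AmbientGAN idea referenced in the abstract can be illustrated with a minimal sketch: instead of showing the discriminator the generator's clean output directly, the clean output is first passed through a simulated measurement (noise) operator, so the discriminator always compares noisy measurements with noisy measurements. The NumPy toy below is an assumption-laden illustration only; the operator here is simple additive Gaussian noise, and the image sizes and function names are hypothetical, not the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def measurement_operator(clean_img, noise_std=0.1):
    # Simulated measurement process (additive Gaussian noise is an
    # assumption; the paper's operator would model the actual
    # acquisition noise of the target modality).
    return clean_img + noise_std * rng.standard_normal(clean_img.shape)

def discriminator_inputs(source_img, generated_clean, noisy_target):
    # AmbientGAN-style training step: corrupt the generator's clean
    # output with the known measurement operator before it reaches the
    # discriminator, so real and fake pairs are both measurements.
    fake_measurement = measurement_operator(generated_clean)
    real_pair = np.stack([source_img, noisy_target])
    fake_pair = np.stack([source_img, fake_measurement])
    return real_pair, fake_pair

# Toy usage with hypothetical 64x64 images.
mri = rng.standard_normal((64, 64))        # source-modality image
clean_pet = rng.standard_normal((64, 64))  # stand-in for generator output
noisy_pet = measurement_operator(clean_pet)
real_pair, fake_pair = discriminator_inputs(mri, clean_pet, noisy_pet)
```

In a full Pix2PixGAN-style setup, `real_pair` and `fake_pair` would be fed to a conditional discriminator, and only the measurement-corruption step distinguishes this from standard conditional adversarial training on clean targets.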
Conference Presentation
(2024) Published by SPIE. Downloading of the abstract is permitted for personal use only.
Wentao Chen, Xichen Xu, Jie Luo, and Weimin Zhou "Ambient-Pix2PixGAN for translating medical images from noisy data", Proc. SPIE 12929, Medical Imaging 2024: Image Perception, Observer Performance, and Technology Assessment, 129290H (29 March 2024); https://doi.org/10.1117/12.3008260
KEYWORDS
Medical imaging
Positron emission tomography
Image quality
Signal to noise ratio
Magnetic resonance imaging
Signal detection
Image processing