Article

Cross modality medical image synthesis for improving liver segmentation

Details

Citation

Rafiq M, Ali H, Mujtaba G, Shah Z & Azmat S (2025) Cross modality medical image synthesis for improving liver segmentation. Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization, 13 (1). https://doi.org/10.1080/21681163.2025.2476702

Abstract
Deep learning-based computer-aided diagnosis (CAD) of medical images requires large datasets. However, the lack of large, publicly available labelled datasets limits the development of deep learning-based CAD systems. Generative Adversarial Networks (GANs), in particular CycleGAN, can generate new cross-domain images without paired training data. However, most CycleGAN-based synthesis methods struggle to overcome misalignment and asymmetry between the input and the generated data. We propose a two-stage technique for the synthesis of abdominal MRI using cross-modality translation of abdominal CT, and we show that the synthetic data can improve the performance of a liver segmentation network. We increase the number of abdominal MRI images through cross-modality transformation of unpaired CT images using a CycleGAN-inspired, deformation-invariant network called EssNet. We then combine the synthetic MRI images with the original MRI images and use them to improve the accuracy of the U-Net on a liver segmentation task. We train the U-Net first on real MRI images alone and then on both real and synthetic MRI images; comparing the two scenarios shows an improvement in U-Net performance, with Intersection over Union (IoU) increasing by 1.17%. The results show the potential of this approach to address the data scarcity challenge in medical imaging.
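As a concrete illustration of the evaluation metric quoted above, the following Python sketch computes Intersection over Union (IoU) for binary liver segmentation masks. The array names, mask shapes, and example values are illustrative assumptions, not code or data from the paper.

    import numpy as np

    def binary_iou(pred, target, eps=1e-7):
        # Intersection over Union for binary segmentation masks:
        # |pred AND target| / |pred OR target|.
        pred = pred.astype(bool)
        target = target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        union = np.logical_or(pred, target).sum()
        return float((intersection + eps) / (union + eps))

    # Hypothetical example: compare a predicted liver mask with ground truth.
    pred_mask = np.zeros((256, 256), dtype=np.uint8)
    true_mask = np.zeros((256, 256), dtype=np.uint8)
    pred_mask[100:180, 90:170] = 1
    true_mask[105:185, 95:175] = 1
    print(f"IoU: {binary_iou(pred_mask, true_mask):.4f}")

The small eps term guards against division by zero when both masks are empty; it has a negligible effect otherwise.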

Keywords
Computer-aided diagnosis; CycleGAN; medical imaging; MRI; segmentation

Journal
Computer Methods in Biomechanics and Biomedical Engineering: Imaging & Visualization: Volume 13, Issue 1

Status: Published
Publication date: 31/12/2025
Publication date online: 31/03/2025
Date accepted by journal: 01/03/2025
URL: http://hdl.handle.net/1893/36952
Publisher: Informa UK Limited
ISSN: 2168-1163
eISSN: 2168-1171