
TAI-GAN: A Temporally and Anatomically Informed Generative Adversarial Network for early-to-late frame conversion in dynamic cardiac PET inter-frame motion correction

Bibliographic Details
Published in: Medical Image Analysis, 2024-08, Vol. 96, p. 103190, Article 103190
Main Authors: Guo, Xueqi, Shi, Luyao, Chen, Xiongchao, Liu, Qiong, Zhou, Bo, Xie, Huidong, Liu, Yi-Hwa, Palyo, Richard, Miller, Edward J., Sinusas, Albert J., Staib, Lawrence, Spottiswoode, Bruce, Liu, Chi, Dvornek, Nicha C.
Format: Article
Language:English
Description
Summary: Inter-frame motion in dynamic cardiac positron emission tomography (PET) using rubidium-82 (82Rb) myocardial perfusion imaging impacts myocardial blood flow (MBF) quantification and the diagnostic accuracy of coronary artery disease. However, the high cross-frame distribution variation caused by rapid tracer kinetics poses a considerable challenge for inter-frame motion correction, particularly for early frames, where intensity-based image registration techniques often fail. To address this issue, we propose the Temporally and Anatomically Informed Generative Adversarial Network (TAI-GAN), which uses an all-to-one mapping to convert early frames into frames whose tracer distribution is similar to that of the last reference frame. TAI-GAN incorporates a feature-wise linear modulation layer that encodes channel-wise parameters generated from temporal information, together with rough cardiac segmentation masks with local shifts that serve as anatomical information. The proposed method was evaluated on a clinical 82Rb PET dataset, and the results show that TAI-GAN produces converted early frames with high image quality, comparable to the real reference frames. After TAI-GAN conversion, motion estimation accuracy and the subsequent MBF quantification with both conventional and deep learning-based motion correction methods were improved compared to using the original frames. The code is available at https://github.com/gxq1998/TAI-GAN.
Highlights:
•Temporally and anatomically informed GAN frame conversion for cardiac motion correction
•Feature-wise linear modulation to address tracer distribution variability
•Dual-channel input with anatomical locators to address spatial mismatch
•Improved current motion correction methods and the downstream MBF quantification
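
The feature-wise linear modulation (FiLM) conditioning described in the summary can be illustrated with a minimal sketch. The module name, tensor shapes, and the construction of the temporal conditioning vector below are illustrative assumptions rather than the authors' released implementation; for the actual code, see the linked repository.

```python
# Minimal FiLM-style conditioning sketch (PyTorch): a conditioning vector
# (e.g., an encoding of frame timing) produces per-channel scale and shift
# parameters that modulate a generator feature map. Shapes and names here are
# assumptions; the authors' code is at https://github.com/gxq1998/TAI-GAN.
import torch
import torch.nn as nn

class FiLMLayer(nn.Module):
    """Computes per-channel gamma/beta from a conditioning vector and applies
    them to a 3D feature map: out = gamma * x + beta."""
    def __init__(self, cond_dim: int, num_channels: int):
        super().__init__()
        self.to_gamma_beta = nn.Linear(cond_dim, 2 * num_channels)

    def forward(self, x: torch.Tensor, cond: torch.Tensor) -> torch.Tensor:
        # x: (B, C, D, H, W) feature map; cond: (B, cond_dim) temporal encoding
        gamma, beta = self.to_gamma_beta(cond).chunk(2, dim=1)
        gamma = gamma.view(*gamma.shape, 1, 1, 1)  # broadcast over D, H, W
        beta = beta.view(*beta.shape, 1, 1, 1)
        return gamma * x + beta

# Example usage with hypothetical sizes: a length-8 frame-timing encoding
# conditioning a 32-channel 3D feature map.
film = FiLMLayer(cond_dim=8, num_channels=32)
features = torch.randn(2, 32, 16, 16, 16)
timing_code = torch.randn(2, 8)
modulated = film(features, timing_code)
print(modulated.shape)  # torch.Size([2, 32, 16, 16, 16])
```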
ISSN:1361-8415
1361-8423
DOI:10.1016/j.media.2024.103190