MR-based Attenuation Correction for Brain PET Using 3D Cycle-Consistent Adversarial Network
Published in: IEEE Transactions on Radiation and Plasma Medical Sciences, 2021-03, Vol. 5 (2), pp. 185-192
Format: Article
Language: English
Summary: Attenuation correction (AC) is important for the quantitative accuracy of positron emission tomography (PET). However, attenuation coefficients cannot be derived directly from magnetic resonance (MR) images in PET/MR systems. In this work, we aimed to derive continuous AC maps from Dixon MR images without requiring registration between MR and computed tomography (CT) images. To achieve this, a 3D generative adversarial network with both discriminative and cycle-consistency losses (Cycle-GAN) was developed. A modified 3D U-net was employed as the structure of the generative networks to produce pseudo-CT/MR images, and 3D patch-based discriminative networks were used to distinguish the generated pseudo-CT/MR images from true CT/MR images. To evaluate its performance, datasets from 32 patients were used in the experiment. The Dixon segmentation and atlas methods provided by the vendor, and a convolutional neural network (CNN) method that utilized registered MR and CT images, served as reference methods. Dice coefficients of the pseudo-CT images and regional quantification in the reconstructed PET images were compared. Results show that the Cycle-GAN framework generates more accurate AC maps than the Dixon segmentation and atlas methods, and performs comparably to the CNN method.
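The cycle-consistency constraint described in the summary can be illustrated with a minimal sketch. The toy generators `G` (MR to pseudo-CT) and `F` (CT to pseudo-MR) below are simple linear maps chosen only for illustration; in the paper each generator is a modified 3D U-net, and the full objective also includes the adversarial (discriminative) losses, which are omitted here.

```python
import numpy as np

# Hypothetical stand-ins for the two Cycle-GAN generators.
# In the paper these are modified 3D U-nets; here they are exact
# inverses of each other so the cycle loss is (nearly) zero.
def G(mr):
    """Toy MR -> pseudo-CT generator."""
    return 2.0 * mr + 1.0

def F(ct):
    """Toy CT -> pseudo-MR generator (inverse of G)."""
    return (ct - 1.0) / 2.0

def cycle_consistency_loss(mr_batch, ct_batch):
    """L1 cycle loss: translating a patch to the other domain and back
    should reproduce the input, i.e. ||F(G(mr)) - mr|| + ||G(F(ct)) - ct||."""
    forward_cycle = np.abs(F(G(mr_batch)) - mr_batch).mean()
    backward_cycle = np.abs(G(F(ct_batch)) - ct_batch).mean()
    return forward_cycle + backward_cycle

rng = np.random.default_rng(0)
mr = rng.random((4, 8, 8, 8))  # toy batch of 3D "MR" patches
ct = rng.random((4, 8, 8, 8))  # toy batch of 3D "CT" patches

# With exactly inverse generators the loss is ~0 (up to float rounding).
print(cycle_consistency_loss(mr, ct))
```

Because the loss is computed on unpaired batches of MR and CT patches, this constraint is what removes the need for registered MR/CT training pairs.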
ISSN: 2469-7311, 2469-7303
DOI: 10.1109/TRPMS.2020.3006844