DC2Anet: Generating Lumbar Spine MR Images from CT Scan Data Based on Semi-Supervised Learning

Bibliographic Details
Published in: Applied Sciences, 2019-06, Vol. 9 (12), p. 2521
Main Authors: Jin, Cheng-Bin, Kim, Hakil, Liu, Mingjie, Han, In Ho, Lee, Jae Il, Lee, Jung Hwan, Joo, Seongsu, Park, Eunsik, Ahn, Young Saem, Cui, Xuenan
Format: Article
Language:English
Description
Summary: Magnetic resonance imaging (MRI) plays a significant role in the diagnosis of lumbar disc disease. However, the use of MRI is limited because of its high cost and significant operating and processing time. More importantly, MRI is contraindicated for some patients with claustrophobia or cardiac pacemakers due to the possibility of injury. In contrast, computed tomography (CT) scans are much less expensive, are faster, and do not face the same limitations. In this paper, we propose a method for estimating lumbar spine MR images based on CT images using a novel objective function and a dual cycle-consistent adversarial network (DC2Anet) with semi-supervised learning. The objective function includes six independent loss terms to balance quantitative and qualitative losses, enabling the generation of a realistic and accurate synthetic MR image. DC2Anet is also capable of semi-supervised learning, and the network is general enough for supervised or unsupervised setups. Experimental results prove that the method is accurate, being able to construct MR images that closely approximate reference MR images, while also outperforming four other state-of-the-art methods.
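The abstract describes an objective function built from six independent, weighted loss terms. The minimal sketch below illustrates how such a multi-term objective is typically combined into a single scalar training loss; the term names and weights here are hypothetical placeholders drawn from common image-synthesis losses, not the paper's actual six terms, which the record does not enumerate.

```python
def combined_loss(terms, weights):
    """Weighted sum of independent loss terms into one scalar objective.

    terms:   dict mapping loss name -> scalar value for the current step
    weights: dict mapping the same loss names -> trade-off coefficients
    """
    assert set(terms) == set(weights), "every term needs a weight"
    return sum(weights[name] * terms[name] for name in terms)


# Hypothetical per-step values for a six-term CT-to-MR synthesis objective
# (names and magnitudes are illustrative assumptions, not the paper's terms)
terms = {
    "adversarial": 0.70,  # realism of the synthetic MR (GAN loss)
    "cycle": 0.20,        # cycle consistency: CT -> MR -> CT reconstruction
    "voxel": 0.05,        # voxel-wise distance to a reference MR, if paired
    "gradient": 0.03,     # image-gradient difference (edge preservation)
    "perceptual": 0.40,   # feature-space distance
    "structural": 0.10,   # structural-similarity penalty
}
weights = {
    "adversarial": 1.0, "cycle": 10.0, "voxel": 100.0,
    "gradient": 1.0, "perceptual": 1.0, "structural": 1.0,
}
total = combined_loss(terms, weights)
```

Weighting lets quantitative terms (e.g. voxel-wise error) and qualitative terms (e.g. the adversarial loss) be balanced against each other, which is the role the abstract ascribes to the six-term objective; in a semi-supervised setup, paired-data terms would simply be dropped or zero-weighted for unpaired samples.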
ISSN: 2076-3417
DOI: 10.3390/app9122521