Multi-Contrast MRI Reconstruction Based on Frequency Domain Separation and Cross-Self-Attention

Bibliographic Details
Published in: IEEE Access, 2024, Vol. 12, p. 55062-55076
Main Authors: Qiu, Yiran, Zhang, Haotian, Ma, Qiaoyu, Yang, Guangsong, Lai, Zongying
Format: Article
Language: English
Description
Summary: Multi-contrast magnetic resonance imaging (MRI) aids medical diagnosis and decision-making. However, its long acquisition time makes the images prone to motion artifacts. To accelerate MRI, k-space data are under-sampled, which degrades image quality. To restore the images, we propose a Residual Convolution Swin Transformer Block (RCSTB) that combines local feature representation from a convolutional neural network module with global long-range feature extraction from a transformer. In addition, the structural similarities reflected in the high-frequency components of multi-contrast MR images are separated in the frequency domain and fused through a cross-self-attention mechanism to assist the reconstruction of under-sampled images. The residual information obtained by frequency-domain separation of the RCSTB-reconstructed images is then used to complete the final reconstruction. Experimental results show that our method outperforms state-of-the-art methods in Peak Signal-to-Noise Ratio (PSNR), Structural Similarity Index Measure (SSIM), and Relative L2 Norm Error (RLNE) under different sampling patterns, and the reconstructed details show improvements visible to the naked eye.
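
The record gives only the abstract, so the Python/PyTorch sketch below is an illustration of two ideas it names, not the authors' implementation: a frequency-domain separation that keeps high-frequency structural detail, and a cross-attention fusion in which features of the under-sampled target contrast query a reference contrast. All module names, tensor shapes, and the cutoff parameter are assumptions introduced here for illustration.

    # Minimal sketch (not the paper's released code) of frequency-domain
    # separation plus cross-contrast attention fusion; names and the
    # cutoff radius are hypothetical.
    import torch
    import torch.nn as nn


    def high_frequency(x: torch.Tensor, cutoff: float = 0.1) -> torch.Tensor:
        """Keep only k-space samples outside a centered low-frequency disc.

        x: (B, C, H, W) image-domain tensor; cutoff is the disc radius as a
        fraction of the smaller image dimension (an assumed parameter).
        """
        B, C, H, W = x.shape
        k = torch.fft.fftshift(torch.fft.fft2(x), dim=(-2, -1))
        yy, xx = torch.meshgrid(
            torch.arange(H, dtype=torch.float32) - H // 2,
            torch.arange(W, dtype=torch.float32) - W // 2,
            indexing="ij",
        )
        # Boolean high-pass mask: True outside the low-frequency disc.
        mask = (yy ** 2 + xx ** 2).sqrt() > cutoff * min(H, W)
        k = k * mask.to(k.device)
        return torch.fft.ifft2(torch.fft.ifftshift(k, dim=(-2, -1))).real


    class CrossContrastAttention(nn.Module):
        """Fuse high-frequency structure from a reference contrast into the
        target's features with standard multi-head cross-attention
        (query = target, key/value = reference)."""

        def __init__(self, dim: int = 64, heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)
            self.norm = nn.LayerNorm(dim)

        def forward(self, target: torch.Tensor, ref: torch.Tensor) -> torch.Tensor:
            # target, ref: (B, C, H, W) feature maps -> (B, H*W, C) token sequences
            B, C, H, W = target.shape
            q = target.flatten(2).transpose(1, 2)
            kv = ref.flatten(2).transpose(1, 2)
            fused, _ = self.attn(self.norm(q), self.norm(kv), self.norm(kv))
            # Residual connection keeps the target's own content.
            return (q + fused).transpose(1, 2).reshape(B, C, H, W)


    if __name__ == "__main__":
        t2_under = torch.randn(1, 64, 32, 32)                 # under-sampled target features
        t1_ref = high_frequency(torch.randn(1, 64, 32, 32))   # reference high-frequency part
        out = CrossContrastAttention()(t2_under, t1_ref)
        print(out.shape)  # torch.Size([1, 64, 32, 32])

Cross-attention with the target as query and the reference as key/value is a standard way to inject structure from one contrast into another; the paper's cross-self-attention mechanism may additionally apply self-attention within the target, which this sketch omits.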
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2024.3388379