Deep Learning Framework for Alzheimer's Disease Diagnosis via 3D-CNN and FSBi-LSTM
Published in: IEEE Access, 2019, Vol. 7, pp. 63605-63618
Format: Article
Language: English
Summary: Alzheimer's disease (AD) is an irreversible, progressive neurodegenerative disorder. Mild cognitive impairment (MCI) is the prodromal state of AD and is further classified into a progressive state (pMCI) and a stable state (sMCI). With the development of deep learning, convolutional neural networks (CNNs) have made great progress in image recognition using magnetic resonance imaging (MRI) and positron emission tomography (PET) for AD diagnosis. However, due to the limited availability of these imaging data, it is still challenging to use CNNs effectively for AD diagnosis. Toward this end, we design a novel deep learning framework that exploits the virtues of a 3D-CNN and a fully stacked bidirectional long short-term memory (FSBi-LSTM). First, we design a 3D-CNN architecture to derive deep feature representations from both MRI and PET. Then, we apply the FSBi-LSTM to the hidden spatial information in the deep feature maps to further improve performance. Finally, we validate our method on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. Our method achieves average accuracies of 94.82%, 86.36%, and 65.35% for differentiating AD from normal control (NC), pMCI from NC, and sMCI from NC, respectively, and outperforms the related algorithms in the literature.
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2913847
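
The abstract above describes a two-stage pipeline: a 3D-CNN extracts deep feature maps from MRI and PET volumes, and an FSBi-LSTM is run over the spatial positions of those maps, keeping every hidden state ("fully stacked") before classification. Below is a minimal PyTorch sketch of that idea. The class names (Encoder3D, FSBiLSTMClassifier), the layer counts, channel widths, hidden size, and the toy 24x24x24 input resolution are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch: 3D-CNN feature extraction from MRI and PET, followed by a
# bidirectional LSTM over the flattened spatial positions of the feature maps.
# All sizes below are assumptions chosen so the shapes line up, not the paper's.
import torch
import torch.nn as nn


class Encoder3D(nn.Module):
    """Small 3D-CNN branch; MRI and PET each get one (hypothetical sizing)."""
    def __init__(self, out_channels: int = 32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, out_channels, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
        )

    def forward(self, x):                      # x: (B, 1, D, H, W)
        return self.features(x)                # (B, C, D/4, H/4, W/4)


class FSBiLSTMClassifier(nn.Module):
    """Bi-LSTM over spatial positions of the fused MRI/PET feature maps."""
    def __init__(self, channels: int = 32, hidden: int = 64, num_classes: int = 2,
                 seq_len: int = 2 * 6 * 6 * 6):  # two modalities x 6^3 positions
        super().__init__()
        self.mri_enc = Encoder3D(channels)
        self.pet_enc = Encoder3D(channels)
        self.lstm = nn.LSTM(channels, hidden, batch_first=True, bidirectional=True)
        # "Fully stacked": every hidden state is kept and concatenated,
        # rather than reading out only the final state.
        self.classifier = nn.Linear(seq_len * 2 * hidden, num_classes)

    def forward(self, mri, pet):
        fm = torch.cat([self.mri_enc(mri), self.pet_enc(pet)], dim=2)  # fuse branches
        b, c = fm.shape[:2]
        seq = fm.view(b, c, -1).permute(0, 2, 1)   # (B, positions, C)
        states, _ = self.lstm(seq)                 # (B, positions, 2*hidden)
        return self.classifier(states.flatten(1))  # logits, e.g. AD vs. NC


if __name__ == "__main__":
    model = FSBiLSTMClassifier()
    mri = torch.randn(2, 1, 24, 24, 24)   # toy-sized volumes for a shape check
    pet = torch.randn(2, 1, 24, 24, 24)
    print(model(mri, pet).shape)           # torch.Size([2, 2])
```

Flattening and keeping all hidden states preserves per-position spatial information that a last-state readout would discard, which is the motivation the abstract gives for applying the FSBi-LSTM to the hidden spatial information of the feature maps.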