Fusing texture, shape and deep model-learned information at decision level for automated classification of lung nodules on chest CT
Published in: Information Fusion, 2018-07, Vol. 42, pp. 102-110
Main Authors:
Format: Article
Language: English
Summary:
•Using nodule heterogeneity (texture/shape) features and a representation learned by a deep model.
•Constructing an ensemble classifier from a back-propagation neural network and AdaBoost.
•Fusing the decisions made by three ensemble classifiers, each trained on one of the three feature types.
•Outperforming three state-of-the-art nodule classification approaches on the LIDC-IDRI dataset.
The separation of malignant from benign lung nodules on chest computed tomography (CT) is important for the early detection of lung cancer, since early detection and management offer the best chance of cure. Although deep learning methods have recently produced marked improvements in image classification, challenges remain: these methods have myriad parameters and require large-scale training sets that are not usually available for routine medical imaging studies. In this paper, we propose an algorithm for lung nodule classification that fuses texture, shape and deep model-learned information (Fuse-TSD) at the decision level. The algorithm employs a gray level co-occurrence matrix (GLCM)-based texture descriptor and a Fourier shape descriptor to characterize the heterogeneity of nodules, and a deep convolutional neural network (DCNN) to automatically learn a feature representation of nodules on a slice-by-slice basis. It trains an AdaBoosted back-propagation neural network (BPNN) on each feature type and fuses the decisions made by the three classifiers to differentiate nodules. We evaluated this algorithm against three approaches on the LIDC-IDRI dataset. When the nodules with a composite malignancy rating of 3 were discarded, regarded as benign, or regarded as malignant, the Fuse-TSD algorithm achieved an AUC of 96.65%, 94.45% and 81.24%, respectively, substantially higher than the AUCs obtained by the other approaches.
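As a rough illustration of the decision-level fusion described in the abstract, the following Python sketch trains one boosted classifier per feature stream and averages their malignancy probabilities. It is not the authors' implementation: the three feature matrices are random placeholders standing in for the GLCM texture, Fourier shape and DCNN-learned features, and scikit-learn's AdaBoostClassifier over decision stumps is used as a stand-in for the paper's AdaBoosted BPNN; only the overall scheme (one classifier per feature type, fused at the decision level) follows the Fuse-TSD idea.

```python
# Minimal sketch of decision-level fusion over three feature streams
# (texture, shape, deep features). Illustrative only, not the paper's code:
# - the feature matrices are random placeholders for GLCM texture,
#   Fourier shape and DCNN-learned features;
# - AdaBoost over decision stumps replaces the AdaBoosted BPNN;
# - the fusion rule is a simple average of class probabilities.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 400                                  # number of nodules (placeholder)
y = rng.integers(0, 2, size=n)           # 0 = benign, 1 = malignant (placeholder labels)

# Placeholder feature streams; in the paper these come from a GLCM texture
# descriptor, a Fourier shape descriptor and a DCNN, respectively.
features = {
    "texture": rng.normal(size=(n, 24)) + y[:, None] * 0.5,
    "shape":   rng.normal(size=(n, 16)) + y[:, None] * 0.3,
    "deep":    rng.normal(size=(n, 64)) + y[:, None] * 0.4,
}

idx_train, idx_test = train_test_split(np.arange(n), test_size=0.3,
                                       stratify=y, random_state=0)

# Train one boosted classifier per feature stream.
probs = []
for name, X in features.items():
    clf = AdaBoostClassifier(n_estimators=100, random_state=0)
    clf.fit(X[idx_train], y[idx_train])
    probs.append(clf.predict_proba(X[idx_test])[:, 1])

# Decision-level fusion: average the per-classifier malignancy probabilities.
fused = np.mean(probs, axis=0)
print("fused AUC:", roc_auc_score(y[idx_test], fused))
```

Averaging the three outputs lets each descriptor compensate for cases the others misjudge, which is the rationale the abstract gives for fusing at the decision level rather than concatenating the features.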
ISSN: 1566-2535, 1872-6305
DOI: 10.1016/j.inffus.2017.10.005