Dual-mode artificially-intelligent diagnosis of breast tumours in shear-wave elastography and B-mode ultrasound using deep polynomial networks

•AI-based diagnosis for breast cancer is proposed using the deep polynomial network (DPN).
•Dual-modal methods outperform single-modal ones for breast tumor classification.
•DPN achieves 97.8% sensitivity, 94.1% specificity, 95.6% accuracy, and 0.961 AUC.
•DPN outperforms conventional feature learning methods.

Bibliographic Details
Published in: Medical Engineering & Physics 2019-02, Vol. 64, p. 1-6
Main Authors: Zhang, Qi, Song, Shuang, Xiao, Yang, Chen, Shuai, Shi, Jun, Zheng, Hairong
Format: Article
Language:English
Summary: The main goal of this study is to build an artificial intelligence (AI) architecture for automated extraction of dual-modal image features from both shear-wave elastography (SWE) and B-mode ultrasound, and to evaluate the AI architecture for classification between benign and malignant breast tumors. In this AI architecture, ultrasound images were segmented by the reaction-diffusion level set model combined with the Gabor-based anisotropic diffusion algorithm. Morphological features and texture features were then extracted from the SWE and B-mode ultrasound images in the contourlet domain. Finally, a framework for feature learning and classification with the deep polynomial network (DPN) was applied to the dual-modal features to distinguish between malignant and benign breast tumors. With leave-one-out cross validation, the DPN method on dual-modal features achieved a sensitivity of 97.8%, a specificity of 94.1%, an accuracy of 95.6%, a Youden's index of 91.9% and an area under the receiver operating characteristic curve of 0.961, which was superior to classic single-modal methods and to dual-modal methods using principal component analysis and multiple kernel learning. These results demonstrate that the dual-modal AI-based technique with DPN has the potential for breast tumor classification in future clinical practice.
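The evaluation metrics reported in the summary (sensitivity, specificity, accuracy, Youden's index) all derive from a binary confusion matrix. A minimal sketch of how they are computed is shown below; the case counts used in the example are hypothetical, chosen only so that the resulting ratios land near the percentages reported above, and are not the study's actual data.

```python
def diagnostic_metrics(tp, fn, tn, fp):
    """Return sensitivity, specificity, accuracy, and Youden's index
    from confusion-matrix counts (tp/fn = malignant, tn/fp = benign)."""
    sensitivity = tp / (tp + fn)                 # true positive rate
    specificity = tn / (tn + fp)                 # true negative rate
    accuracy = (tp + tn) / (tp + fn + tn + fp)   # overall correct fraction
    youden = sensitivity + specificity - 1       # Youden's J statistic
    return sensitivity, specificity, accuracy, youden

# Hypothetical counts: 44 of 45 malignant and 64 of 68 benign cases correct.
sens, spec, acc, j = diagnostic_metrics(tp=44, fn=1, tn=64, fp=4)
print(f"sensitivity={sens:.3f} specificity={spec:.3f} "
      f"accuracy={acc:.3f} Youden={j:.3f}")
# → sensitivity=0.978 specificity=0.941 accuracy=0.956 Youden=0.919
```

Note that Youden's index rewards classifiers that are strong on both classes at once, which is why the study reports it alongside plain accuracy for an imbalanced benign/malignant mix.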
ISSN:1350-4533
1873-4030
DOI:10.1016/j.medengphy.2018.12.005