Performance of novel deep learning network with the incorporation of the automatic segmentation network for diagnosis of breast cancer in automated breast ultrasound
Published in: European Radiology, 2022-10, Vol. 32 (10), pp. 7163-7172
Format: Article
Language: English
Summary:
Objective
To develop a novel deep learning network (DLN) incorporating an automatic segmentation network (ASN) for morphological analysis, and to determine its performance in the diagnosis of breast cancer on automated breast ultrasound (ABUS).
Methods
A total of 769 breast tumors were enrolled in this study and randomly divided into a training set (n = 600) and a test set (n = 169). The novel DLNs (ResNet34 v2, ResNet50 v2, ResNet101 v2) added a new ASN to the traditional ResNet networks to extract morphological information of breast tumors. The accuracy, sensitivity, specificity, positive predictive value (PPV), negative predictive value (NPV), area under the receiver operating characteristic (ROC) curve (AUC), and average precision (AP) were calculated. The diagnostic performance of the novel DLNs was compared with that of two radiologists with different levels of experience.
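The abstract does not give implementation details, so the following is only an illustrative sketch, assuming a PyTorch setting with a standard torchvision ResNet-50 backbone as a stand-in for ResNet50 v2: an auxiliary segmentation head predicts a coarse tumor mask, and its pooled response is concatenated with the image features before classification, as one plausible way to feed morphological information into the classifier. All class and attribute names (SegAssistedClassifier, seg_head, etc.) are hypothetical, not taken from the paper.

```python
# Illustrative sketch only (not the authors' network): a ResNet backbone with an
# auxiliary segmentation branch whose coarse tumor mask is pooled and fused into
# the classification head as a proxy for "morphological information".
import torch
import torch.nn as nn
from torchvision.models import resnet50


class SegAssistedClassifier(nn.Module):  # hypothetical name
    def __init__(self, num_classes: int = 2):
        super().__init__()
        backbone = resnet50(weights=None)  # plain ResNet-50 as a stand-in for ResNet50 v2
        # Everything up to (but excluding) global pooling and the fc layer.
        self.encoder = nn.Sequential(*list(backbone.children())[:-2])  # -> B x 2048 x H/32 x W/32
        # Lightweight segmentation head predicting a coarse tumor mask.
        self.seg_head = nn.Sequential(
            nn.Conv2d(2048, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 1, kernel_size=1),
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        # Classifier sees pooled image features plus the pooled mask response.
        self.classifier = nn.Linear(2048 + 1, num_classes)

    def forward(self, x):
        feats = self.encoder(x)                        # B x 2048 x h x w
        mask_logits = self.seg_head(feats)             # B x 1 x h x w (segmentation output)
        img_vec = self.pool(feats).flatten(1)          # B x 2048
        mask_vec = self.pool(mask_logits).flatten(1)   # B x 1
        cls_logits = self.classifier(torch.cat([img_vec, mask_vec], dim=1))
        return cls_logits, mask_logits                 # joint classification + segmentation outputs


# Quick shape check on a dummy ABUS slice
model = SegAssistedClassifier()
cls_out, seg_out = model(torch.randn(2, 3, 224, 224))
print(cls_out.shape, seg_out.shape)  # torch.Size([2, 2]) torch.Size([2, 1, 7, 7])
```

In a full training setup the segmentation branch would also be supervised with a mask loss; the sketch only shows how segmentation and classification outputs can share one backbone.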
Results
The ResNet34 v2 model had higher specificity (76.81%) and PPV (82.22%) than the other two, the ResNet50 v2 model had higher accuracy (78.11%) and NPV (72.86%), and the ResNet101 v2 model had higher sensitivity (85.00%). According to the AUCs and APs, the novel ResNet101 v2 model produced the best result (AUC 0.85, AP 0.90) compared with the remaining five DLNs. Compared with the novice radiologist, the novel DLNs performed better: the F1 score increased from 0.77 to 0.78, 0.81, and 0.82 with the three novel DLNs, respectively. However, their diagnostic performance was still worse than that of the experienced radiologist.
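For readers unfamiliar with the reported metrics, the sketch below shows one way to compute accuracy, sensitivity, specificity, PPV, NPV, AUC, AP, and F1 from per-tumor malignancy scores using scikit-learn; the labels and scores are synthetic placeholders, not the study's data.

```python
# Illustrative only: computing the reported diagnostic metrics from synthetic
# per-tumor labels and malignancy scores (not the study's data).
import numpy as np
from sklearn.metrics import (roc_auc_score, average_precision_score,
                             confusion_matrix, f1_score)

rng = np.random.default_rng(0)
y_true = rng.integers(0, 2, size=169)                           # 1 = malignant, 0 = benign
y_score = np.clip(y_true * 0.3 + rng.random(169) * 0.7, 0, 1)   # dummy malignancy scores
y_pred = (y_score >= 0.5).astype(int)                           # binarize at 0.5

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
metrics = {
    "accuracy":    (tp + tn) / (tp + tn + fp + fn),
    "sensitivity": tp / (tp + fn),
    "specificity": tn / (tn + fp),
    "PPV":         tp / (tp + fp),
    "NPV":         tn / (tn + fn),
    "AUC":         roc_auc_score(y_true, y_score),
    "AP":          average_precision_score(y_true, y_score),
    "F1":          f1_score(y_true, y_pred),
}
for name, value in metrics.items():
    print(f"{name}: {value:.3f}")
```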
Conclusions
The novel DLNs performed better than the traditional DLNs and may help novice radiologists improve their diagnostic performance for breast cancer in ABUS.
Key Points
• A novel automatic segmentation network to extract morphological information was successfully developed and implemented with ResNet deep learning networks.
• The novel deep learning networks in our research performed better than the traditional deep learning networks in the diagnosis of breast cancer using ABUS images.
• The novel deep learning networks in our research may be useful for novice radiologists to improve diagnostic performance.
ISSN: 0938-7994; 1432-1084
DOI: 10.1007/s00330-022-08836-x