Attention Dense-U-Net for Automatic Breast Mass Segmentation in Digital Mammogram

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 59037-59047
Main Authors: Li, Shuyi; Dong, Min; Du, Guangming; Mu, Xiaomin
Format: Article
Language: English
Description
Summary: Breast masses are among the most distinctive signs for the diagnosis of breast cancer, and accurate mass segmentation is critical for improving detection accuracy and reducing mortality. Reviewing mammographic films is time-consuming for physicians, and traditional medical segmentation techniques often require prior knowledge or manually extracted features, which can lead to subjective diagnoses. Developing an automatic image segmentation method is therefore important for clinical application. This paper proposes a fully automatic deep-learning method for breast mass segmentation that combines a densely connected U-Net with attention gates (AGs). The network consists of an encoder and a decoder: the encoder is a densely connected convolutional network, and the decoder is a U-Net decoder integrated with AGs. The method is evaluated on the public Digital Database for Screening Mammography (DDSM) using F1-score, mean intersection over union, sensitivity, specificity, and overall accuracy. The experimental results show that the dense U-Net integrated with AGs achieves better segmentation results than U-Net, attention U-Net, DenseNet, and other state-of-the-art methods.
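
The attention gate is the key addition over a plain dense U-Net: it uses the decoder's coarse gating signal to reweight encoder skip features so that irrelevant background is suppressed before the features are passed into the decoder. As a rough, hypothetical sketch, assuming a PyTorch implementation and the additive gate of Oktay et al.'s Attention U-Net (the class name, channel sizes, and same-resolution assumption are illustrative and may differ from the authors' exact design):

# Hypothetical sketch (not the authors' released code): an additive attention
# gate in the style of Oktay et al.'s Attention U-Net.
import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Gates encoder skip features x with the coarser decoder signal g."""
    def __init__(self, in_channels, gating_channels, inter_channels):
        super().__init__()
        self.theta_x = nn.Conv2d(in_channels, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(gating_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)

    def forward(self, x, g):
        # x: skip features from the dense encoder; g: decoder gating signal,
        # assumed here to be already resampled to x's spatial size.
        attn = torch.relu(self.theta_x(x) + self.phi_g(g))
        attn = torch.sigmoid(self.psi(attn))  # per-pixel coefficients in (0, 1)
        return x * attn                       # suppress irrelevant regions

# Example: gate a 64-channel skip map with a 128-channel gating signal.
x = torch.randn(1, 64, 128, 128)
g = torch.randn(1, 128, 128, 128)
out = AttentionGate(64, 128, 32)(x, g)       # shape: (1, 64, 128, 128)

The gated skip features would then be concatenated with the upsampled decoder features, as in a standard U-Net decoder stage.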
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2019.2914873