
Attention Dense-U-Net for Automatic Breast Mass Segmentation in Digital Mammogram

Breast masses are among the most distinctive signs for the diagnosis of breast cancer, and accurate segmentation of masses is critical for improving the accuracy of breast cancer detection and reducing the mortality rate. Reviewing mammographic films is time-consuming for physicians, and traditional medical segmentation techniques often require prior knowledge or manually extracted features, which can lead to subjective diagnoses. Developing an automatic image segmentation method is therefore important for clinical application. This paper proposes a fully automatic deep-learning method for breast mass segmentation that combines a densely connected U-Net with attention gates (AGs). The network consists of an encoder and a decoder: the encoder is a densely connected convolutional network, and the decoder is the U-Net decoder integrated with AGs. The method is evaluated on the public Digital Database for Screening Mammography (DDSM) using F1-score, mean intersection over union, sensitivity, specificity, and overall accuracy. Experimental results show that the dense U-Net with integrated AGs achieves better segmentation results than U-Net, attention U-Net, DenseNet, and other state-of-the-art methods.
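The abstract describes the architecture only at a high level: a densely connected encoder, a U-Net-style decoder, and attention gates (AGs) applied to the skip connections. As a rough illustration, the following is a minimal PyTorch sketch of an additive attention gate in the style of Attention U-Net; the class name, channel sizes, and the assumption that the skip features and gating signal share the same spatial resolution are hypothetical and not taken from the paper.

import torch
import torch.nn as nn

class AttentionGate(nn.Module):
    """Additive attention gate: a decoder gating signal g re-weights the
    encoder skip-connection features x before concatenation."""
    def __init__(self, x_channels: int, g_channels: int, inter_channels: int):
        super().__init__()
        self.theta_x = nn.Conv2d(x_channels, inter_channels, kernel_size=1)
        self.phi_g = nn.Conv2d(g_channels, inter_channels, kernel_size=1)
        self.psi = nn.Conv2d(inter_channels, 1, kernel_size=1)
        self.relu = nn.ReLU(inplace=True)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor, g: torch.Tensor) -> torch.Tensor:
        # x and g are assumed to be spatially aligned here; real implementations
        # typically resample one of them (e.g., upsample g) before the addition.
        attn = self.sigmoid(self.psi(self.relu(self.theta_x(x) + self.phi_g(g))))
        return x * attn  # attention coefficients suppress irrelevant regions

# Usage sketch: gate 64-channel encoder features with a 128-channel decoder signal.
skip = torch.randn(1, 64, 56, 56)
gating = torch.randn(1, 128, 56, 56)
gated = AttentionGate(64, 128, 32)(skip, gating)   # -> shape (1, 64, 56, 56)

The gated skip features would then be concatenated with the upsampled decoder features, as in a standard U-Net decoder block.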

Bibliographic Details
Published in: IEEE Access, 2019, Vol. 7, pp. 59037-59047
Main Authors: Li, Shuyi; Dong, Min; Du, Guangming; Mu, Xiaomin
Format: Article
Language:English
Subjects: attention gates; biomedical image processing; Biomedical imaging; Breast cancer; Breast masses segmentation; Coders; deep learning; densely connected convolutional network; Diagnosis; Feature extraction; Image segmentation; Mammography; Shape
Online Access: https://doi.org/10.1109/ACCESS.2019.2914873
container_end_page 59047
container_issue
container_start_page 59037
container_title IEEE access
container_volume 7
creator Li, Shuyi
Dong, Min
Du, Guangming
Mu, Xiaomin
description Breast masses are among the most distinctive signs for the diagnosis of breast cancer, and accurate segmentation of masses is critical for improving the accuracy of breast cancer detection and reducing the mortality rate. Reviewing mammographic films is time-consuming for physicians, and traditional medical segmentation techniques often require prior knowledge or manually extracted features, which can lead to subjective diagnoses. Developing an automatic image segmentation method is therefore important for clinical application. This paper proposes a fully automatic deep-learning method for breast mass segmentation that combines a densely connected U-Net with attention gates (AGs). The network consists of an encoder and a decoder: the encoder is a densely connected convolutional network, and the decoder is the U-Net decoder integrated with AGs. The method is evaluated on the public Digital Database for Screening Mammography (DDSM) using F1-score, mean intersection over union, sensitivity, specificity, and overall accuracy. Experimental results show that the dense U-Net with integrated AGs achieves better segmentation results than U-Net, attention U-Net, DenseNet, and other state-of-the-art methods.
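The description above names the evaluation metrics. As a hedged sketch (not the authors' evaluation code), the function below computes F1-score, intersection over union, sensitivity, specificity, and overall accuracy from pixel-wise confusion counts for a binary mass mask; the mean IoU reported in the paper is presumably this IoU averaged over images or classes. All names here are hypothetical.

import numpy as np

def segmentation_metrics(pred: np.ndarray, target: np.ndarray, eps: float = 1e-7) -> dict:
    """Pixel-wise metrics for binary masks (1 = mass, 0 = background)."""
    pred, target = pred.astype(bool), target.astype(bool)
    tp = np.logical_and(pred, target).sum()
    tn = np.logical_and(~pred, ~target).sum()
    fp = np.logical_and(pred, ~target).sum()
    fn = np.logical_and(~pred, target).sum()
    precision = tp / (tp + fp + eps)
    sensitivity = tp / (tp + fn + eps)        # recall / true positive rate
    specificity = tn / (tn + fp + eps)        # true negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity + eps)
    iou = tp / (tp + fp + fn + eps)           # intersection over union for the mass class
    accuracy = (tp + tn) / (tp + tn + fp + fn + eps)
    return {"f1": f1, "iou": iou, "sensitivity": sensitivity,
            "specificity": specificity, "accuracy": accuracy}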
doi_str_mv 10.1109/ACCESS.2019.2914873
format article
fulltext fulltext
identifier ISSN: 2169-3536
ispartof IEEE access, 2019, Vol.7, p.59037-59047
issn 2169-3536
2169-3536
language eng
recordid cdi_proquest_journals_2455613259
source IEEE Open Access Journals
subjects attention gates
biomedical image processing
Biomedical imaging
Breast cancer
Breast masses segmentation
Coders
deep learning
densely connected convolutional network
Diagnosis
Feature extraction
Image segmentation
Mammography
Shape
title Attention Dense-U-Net for Automatic Breast Mass Segmentation in Digital Mammogram