
A comparative study of pre-trained convolutional neural networks for semantic segmentation of breast tumors in ultrasound

Bibliographic Details
Published in: Computers in biology and medicine, 2020-11, Vol.126, p.104036, Article 104036
Main Authors: Gómez-Flores, Wilfrido, Coelho de Albuquerque Pereira, Wagner
Format: Article
Language:English
description The automatic segmentation of breast tumors in ultrasound (BUS) has recently been addressed using convolutional neural networks (CNN). These CNN-based approaches generally modify a previously proposed CNN architecture or design a new architecture using CNN ensembles. Although these methods have reported satisfactory results, the trained CNN architectures are often unavailable for reproducibility purposes. Moreover, these methods commonly learn from small BUS datasets with particular properties, which limits generalization to new cases. This paper evaluates four public CNN-based semantic segmentation models developed by the computer vision community: (1) Fully Convolutional Network (FCN) with the AlexNet network, (2) the U-Net network, (3) SegNet using the VGG16 and VGG19 networks, and (4) DeepLabV3+ using the ResNet18, ResNet50, MobileNet-V2, and Xception networks. By transfer learning, these CNNs are fine-tuned to segment BUS images into normal and tumoral pixels. The goal is to select a potential CNN-based segmentation model to be further used in computer-aided diagnosis (CAD) systems. The main significance of this study is the comparison of eight well-established CNN architectures using a more extensive BUS dataset than those used by approaches currently found in the literature. More than 3000 BUS images acquired from seven US machine models are used for training and validation. The F1-score (F1s) and the Intersection over Union (IoU) quantify the segmentation performance. The segmentation models based on SegNet and DeepLabV3+ obtain the best results, with F1s>0.90 and IoU>0.81. In the case of U-Net, the segmentation performance is F1s=0.89 and IoU=0.80, whereas FCN-AlexNet attains the lowest results, with F1s=0.84 and IoU=0.73. In particular, ResNet18 obtains F1s=0.905 and IoU=0.827 and requires the least training time among the SegNet and DeepLabV3+ networks. Hence, ResNet18 is a potential candidate for implementing fully automated end-to-end CAD systems. The CNN models generated in this study are available to researchers at https://github.com/wgomezf/CNN-BUS-segment, which is intended to enable fair comparison with other CNN-based segmentation approaches for BUS images.

Highlights:
• Four deep semantic segmentation models for automatic segmentation of breast tumors in ultrasound are compared experimentally.
• The impact of the type of convolutional neural network architecture on tumor segmentation quality is investigated.
• Eight pre-trained convolutional neural networks are used as backbone models in deep semantic segmentation models.
• The influence of specific properties of the breast ultrasound dataset on segmentation results is analyzed.
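The two reported metrics are closely related: for binary tumor masks, both the F1-score (equivalent to the Dice coefficient) and IoU (the Jaccard index) are ratios of the same confusion counts, and F1 = 2·IoU/(1+IoU), which is why the reported pairs (e.g. F1s>0.90 with IoU>0.81) move together. A minimal sketch of the per-image computation, assuming flattened binary masks (1 = tumoral pixel, 0 = normal); the function names are ours, not from the paper's repository:

```python
# Hypothetical per-image segmentation metrics from flattened binary masks.
# Assumes the union of predicted and true tumor pixels is non-empty.

def confusion_counts(pred, truth):
    """Count true positives, false positives, and false negatives."""
    tp = sum(p == 1 and t == 1 for p, t in zip(pred, truth))
    fp = sum(p == 1 and t == 0 for p, t in zip(pred, truth))
    fn = sum(p == 0 and t == 1 for p, t in zip(pred, truth))
    return tp, fp, fn

def iou(pred, truth):
    """Intersection over Union (Jaccard index) of the tumor class."""
    tp, fp, fn = confusion_counts(pred, truth)
    return tp / (tp + fp + fn)

def f1_score(pred, truth):
    """F1-score (Dice coefficient) of the tumor class."""
    tp, fp, fn = confusion_counts(pred, truth)
    return 2 * tp / (2 * tp + fp + fn)
```

For example, with `pred = [1, 1, 0, 0]` and `truth = [1, 0, 0, 0]`, the counts are tp=1, fp=1, fn=0, so IoU = 0.5 and F1 = 2/3, consistent with the identity F1 = 2·IoU/(1+IoU).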
doi 10.1016/j.compbiomed.2020.104036
pmid 33059238
format article
identifier ISSN: 0010-4825
issn 0010-4825
1879-0534
language eng
source ScienceDirect Freedom Collection
subjects Architecture
Artificial neural networks
Automation
Breast cancer
Breast tumors
Breast ultrasound
Classification
Comparative studies
Computer vision
Convolutional neural networks
Datasets
Design modifications
Image acquisition
Image processing
Image segmentation
Mammography
Medical screening
Methods
Neural networks
Semantic segmentation
Semantics
Training
Transfer learning
Tumors
Ultrasonic imaging
Ultrasound