
Attention-based deep learning for breast lesions classification on contrast enhanced spectral mammography: a multicentre study

Bibliographic Details
Published in: British journal of cancer, 2023-03, Vol. 128 (5), p. 793-804
Main Authors: Mao, Ning, Zhang, Haicheng, Dai, Yi, Li, Qin, Lin, Fan, Gao, Jing, Zheng, Tiantian, Zhao, Feng, Xie, Haizhu, Xu, Cong, Ma, Heng
Format: Article
Language:English
description Background This study aims to develop an attention-based deep learning model for distinguishing benign from malignant breast lesions on contrast-enhanced spectral mammography (CESM). Methods Preoperative CESM images of 1239 patients with pathologically confirmed diagnoses in a multicentre cohort were divided into training, validation, internal test, and external test sets. The regions of interest of the breast lesions were outlined manually by a senior radiologist. We adopted three conventional convolutional neural networks (CNNs), namely DenseNet 121, Xception, and ResNet 50, as the backbone architectures and incorporated the convolutional block attention module (CBAM) into them for classification. The performance of the models was analysed in terms of the receiver operating characteristic (ROC) curve, accuracy, positive predictive value (PPV), negative predictive value (NPV), F1 score, precision-recall curve (PRC), and heat maps. The final models were compared with the diagnostic performance of conventional CNNs, radiomics models, and two radiologists with specialised breast imaging experience. Results The best-performing deep learning model, the CBAM-based Xception, achieved an area under the ROC curve (AUC) of 0.970, a sensitivity of 0.848, a specificity of 1.000, and an accuracy of 0.891 on the external test set, higher than those of the other CNNs, the radiomics models, and the radiologists. The PRC and the heat maps also indicated the favourable predictive performance of the attention-based CNN model. The diagnostic performance of the two radiologists improved with deep learning assistance. Conclusions An attention-based deep learning model based on CESM images can help to distinguish benign from malignant breast lesions, and the diagnostic performance of radiologists improved with deep learning assistance.
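The CBAM step described in the abstract can be sketched roughly as follows: a minimal NumPy illustration of channel-then-spatial attention refining a single feature map. The shared-MLP weights, the pooling descriptors, and the element-wise combination standing in for CBAM's 7×7 convolution are illustrative assumptions for this sketch, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(fmap, w1, w2):
    # fmap: (C, H, W); w1: (C//r, C) and w2: (C, C//r) form a shared two-layer MLP.
    avg = fmap.mean(axis=(1, 2))  # average-pooled channel descriptor, (C,)
    mx = fmap.max(axis=(1, 2))    # max-pooled channel descriptor, (C,)
    # Apply the shared MLP to both descriptors, sum, then squash to (0, 1).
    att = sigmoid(w2 @ np.maximum(w1 @ avg, 0) + w2 @ np.maximum(w1 @ mx, 0))
    return fmap * att[:, None, None]  # rescale each channel

def spatial_attention(fmap):
    # Pool along the channel axis to get two (H, W) maps.
    avg = fmap.mean(axis=0)
    mx = fmap.max(axis=0)
    # Element-wise stand-in for the 7x7 conv over [avg; max] in the paper.
    att = sigmoid(avg + mx)
    return fmap * att[None, :, :]  # rescale each spatial location

def cbam(fmap, w1, w2):
    # CBAM order: channel attention first, then spatial attention.
    return spatial_attention(channel_attention(fmap, w1, w2))
```

Because both attention maps lie in (0, 1), the module only reweights the backbone's features; in the study this refined map feeds the CNN's classification head.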
doi 10.1038/s41416-022-02092-y
format article
publisher London: Nature Publishing Group UK
published 2023-03-23
pmid 36522478
orcid 0000-0003-1116-3369
rights The Author(s), under exclusive licence to Springer Nature Limited 2022
issn 0007-0920
eissn 1532-1827
language eng
source Springer Link; PubMed Central
subjects 692/4028/67/1347
692/699/67/2321
Biomedical and Life Sciences
Biomedicine
Breast
Breast - diagnostic imaging
Breast Neoplasms - pathology
Cancer Research
Classification
Deep Learning
Drug Resistance
Epidemiology
Female
Humans
Lesions
Mammography
Mammography - methods
Molecular Medicine
Neural networks
Neural Networks, Computer
Oncology
Radiomics
Sensitivity and Specificity