MAD-Net: Multi-attention dense network for functional bone marrow segmentation
Radiotherapy is the main treatment modality for various pelvic malignancies. However, high intensity radiation can damage the functional bone marrow (FBM), resulting in hematological toxicity (HT). Accurate identification and protection of the FBM during radiotherapy planning can reduce pelvic HT. T...
Published in: | Computers in biology and medicine 2023-03, Vol.154, p.106428, Article 106428 |
Main Authors: | Qin, Chuanbo; Zheng, Bin; Li, Wanying; Chen, Hongbo; Zeng, Junying; Wu, Chenwang; Liang, Shufen; Luo, Jun; Zhou, Shuquan; Xiao, Lin |
Format: | Article |
Language: | English |
container_end_page | |
container_issue | |
container_start_page | 106428 |
container_title | Computers in biology and medicine |
container_volume | 154 |
creator | Qin, Chuanbo; Zheng, Bin; Li, Wanying; Chen, Hongbo; Zeng, Junying; Wu, Chenwang; Liang, Shufen; Luo, Jun; Zhou, Shuquan; Xiao, Lin |
description | Radiotherapy is the main treatment modality for various pelvic malignancies. However, high-intensity radiation can damage the functional bone marrow (FBM), resulting in hematological toxicity (HT). Accurate identification and protection of the FBM during radiotherapy planning can reduce pelvic HT. The traditional manual method for contouring the FBM is time-consuming and laborious. Therefore, an efficient and accurate automatic segmentation model can provide a distinct advantage in clinical settings. In this paper, we propose the first network for the FBM segmentation task, referred to as the multi-attention dense network (MAD-Net). First, we introduce the dense convolution block to promote gradient flow through the network and encourage feature reuse. Next, a novel slide-window attention module is proposed to capture long-range dependencies and exploit interdependencies between features. Finally, we design a residual-dual attention module as the bottleneck layer, which further aggregates useful spatial details and explores the intra-class responsiveness of high-level features. We conduct extensive experiments on our dataset of 3838 two-dimensional pelvic slices. Experimental results demonstrate that the proposed MAD-Net surpasses previous state-of-the-art models on various metrics. In addition, the contributions of the proposed components are verified by ablation analysis, and experiments on three other datasets demonstrate the generalizability of MAD-Net.
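The abstract only summarizes the slide-window attention module. As a rough sketch of the general idea of windowed self-attention, which restricts pairwise interactions to local neighborhoods so cost scales with the window size rather than the full feature length (the function names, single-head form, and non-overlapping windows here are our assumptions, not the paper's specification):

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slide_window_attention(feat, window=4):
    """Self-attention computed independently inside each non-overlapping
    window of a (length, channels) feature map.

    Each position attends only to positions in its own window, an
    illustrative stand-in for the paper's slide-window attention module.
    """
    n, c = feat.shape
    assert n % window == 0, "length must be divisible by the window size"
    out = np.empty_like(feat)
    for start in range(0, n, window):
        x = feat[start:start + window]        # (window, c) local block
        scores = x @ x.T / np.sqrt(c)         # scaled pairwise similarity
        out[start:start + window] = softmax(scores, axis=-1) @ x
    return out
```

Because each window is processed independently, the attention matrix is `window × window` instead of `length × length`, which is the usual motivation for windowed variants.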
•We propose MAD-Net, the first model to segment the functional bone marrow from pelvic CT images. •The slide-window attention module is designed to explore interdependencies between features. •The residual-dual attention module is employed to enhance high-level feature representations. •MAD-Net surpasses state-of-the-art models in segmentation accuracy while retaining comparable training and inference times. |
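Segmentation accuracy for tasks like this is commonly reported with the Dice similarity coefficient; the abstract does not list the paper's exact metrics, so the following minimal implementation is illustrative only:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    """Dice similarity between two binary masks: 2|A∩B| / (|A| + |B|).

    eps guards against division by zero when both masks are empty.
    """
    pred = np.asarray(pred).astype(bool)
    target = np.asarray(target).astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

# Two 4x4 masks overlapping in one of their two "on" rows:
a = np.zeros((4, 4), dtype=bool); a[:2] = True   # rows 0-1
b = np.zeros((4, 4), dtype=bool); b[1:3] = True  # rows 1-2
print(round(dice_coefficient(a, b), 3))  # → 0.5
```

Identical masks score 1.0 and disjoint masks score approximately 0, which makes Dice a convenient single-number summary of overlap for binary segmentations.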
doi_str_mv | 10.1016/j.compbiomed.2022.106428 |
format | article |
fulltext | fulltext |
identifier | ISSN: 0010-4825; EISSN: 1879-0534; PMID: 36682178 |
ispartof | Computers in biology and medicine, 2023-03, Vol.154, p.106428, Article 106428 |
issn | 0010-4825; 1879-0534 |
language | eng |
recordid | cdi_proquest_journals_2776530771 |
source | ScienceDirect Freedom Collection |
subjects | Ablation Benchmarking Bone marrow Bone Marrow - diagnostic imaging Cancer therapies Cervical cancer Chemotherapy Contouring Datasets Deep learning Dense convolution Female Functional bone marrow segmentation Gradient flow Humans Image Processing, Computer-Assisted Labor, Obstetric Magnetic resonance imaging Malignancy Medical imaging Methods Modules Pelvis Pregnancy Radiation Radiation damage Radiation therapy Residual-dual attention Segmentation Semantics Slide-window attention Toxicity |
title | MAD-Net: Multi-attention dense network for functional bone marrow segmentation |
url | http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2024-12-25T19%3A45%3A26IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_cross&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=MAD-Net:%20Multi-attention%20dense%20network%20for%20functional%20bone%20marrow%20segmentation&rft.jtitle=Computers%20in%20biology%20and%20medicine&rft.au=Qin,%20Chuanbo&rft.date=2023-03&rft.volume=154&rft.spage=106428&rft.pages=106428-&rft.artnum=106428&rft.issn=0010-4825&rft.eissn=1879-0534&rft_id=info:doi/10.1016/j.compbiomed.2022.106428&rft_dat=%3Cproquest_cross%3E2776530771%3C/proquest_cross%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c452t-435a0c1313ce9cf44bfb710a8be4fbef5cb214e6f52b0cdbb9cdc7ea6888dbcd3%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2776530771&rft_id=info:pmid/36682178&rfr_iscdi=true |