Multi-organ auto-delineation in head-and-neck MRI for radiation therapy using regional convolutional neural network
Magnetic resonance imaging (MRI) allows accurate and reliable organ delineation for many disease sites in radiation therapy because it offers superb soft-tissue contrast. Manual organ-at-risk (OAR) delineation, however, is labor-intensive and time-consuming. This study aims to develop a deep-learning-based automated multi-organ segmentation method to reduce the manual labor and accelerate the treatment-planning process for head-and-neck (HN) cancer radiotherapy. A novel regional convolutional neural network (R-CNN) architecture, namely mask scoring R-CNN, was developed in this study. In the proposed model, a deep attention feature pyramid network serves as the backbone to extract coarse features from the MRI, which are then refined by the R-CNN. The final segmentation is obtained through mask and mask scoring networks that take the refined feature maps as input. By incorporating the mask scoring mechanism into conventional mask supervision, the classification error inherent in the conventional mask R-CNN architecture can be greatly reduced. A cohort of 60 HN cancer patients receiving external beam radiation therapy was used for experimental validation, and five-fold cross-validation was performed to assess the proposed method. The Dice similarity coefficients of brain stem, left/right cochlea, left/right eye, larynx, left/right lens, mandible, optic chiasm, left/right optic nerve, oral cavity, left/right parotid, pharynx, and spinal cord were 0.89 ± 0.06, 0.68 ± 0.14/0.68 ± 0.18, 0.89 ± 0.07/0.89 ± 0.05, 0.90 ± 0.07, 0.67 ± 0.18/0.67 ± 0.10, 0.82 ± 0.10, 0.61 ± 0.14, 0.67 ± 0.11/0.68 ± 0.11, 0.92 ± 0.07, 0.85 ± 0.06/0.86 ± 0.05, 0.80 ± 0.13, and 0.77 ± 0.15, respectively. After training, the model can segment all OARs within 1 min.
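To connect the architecture described above to code: in a mask scoring design, the network learns to predict the quality (mask intersection-over-union) of each candidate mask, and the final confidence becomes the product of the classification score and this predicted mask IoU rather than the classification score alone. The following is a minimal PyTorch sketch of such a MaskIoU head, not the authors' implementation; the module name `MaskIoUHead`, the layer sizes, the 14 × 14 RoI resolution, and `num_classes = 18` (the paper's 17 OARs plus background) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class MaskIoUHead(nn.Module):
    """Illustrative mask scoring head (a sketch, not the paper's code).

    Regresses the IoU between each predicted mask and its ground truth,
    so detections can be re-scored as cls_score * predicted_mask_iou.
    """

    def __init__(self, in_channels: int = 256, num_classes: int = 18):
        super().__init__()
        # Input: RoI features (in_channels x 14 x 14) concatenated with
        # the predicted soft mask (1 x 14 x 14) along the channel axis.
        self.convs = nn.Sequential(
            nn.Conv2d(in_channels + 1, 256, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(256, 256, kernel_size=3, stride=2, padding=1),  # 14x14 -> 7x7
            nn.ReLU(inplace=True),
        )
        self.fc = nn.Sequential(
            nn.Flatten(),
            nn.Linear(256 * 7 * 7, 1024),
            nn.ReLU(inplace=True),
            nn.Linear(1024, num_classes),  # one IoU estimate per class
        )

    def forward(self, roi_feats: torch.Tensor, pred_masks: torch.Tensor) -> torch.Tensor:
        # roi_feats: (N, C, 14, 14); pred_masks: (N, 1, 14, 14)
        x = torch.cat([roi_feats, pred_masks], dim=1)
        return self.fc(self.convs(x))  # (N, num_classes) predicted mask IoUs

# Re-scoring step: for RoI i with predicted label l_i,
# final_score_i = cls_score_i * maskiou_head(roi_feats, pred_masks)[i, l_i]
```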
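The numbers reported above are Dice similarity coefficients (DSC), defined for a predicted mask A and a ground-truth mask B as DSC = 2|A ∩ B| / (|A| + |B|). Here is a short NumPy sketch; the function name `dice_coefficient` is ours, for illustration only.

```python
import numpy as np

def dice_coefficient(pred: np.ndarray, truth: np.ndarray, eps: float = 1e-8) -> float:
    """Dice similarity coefficient of two binary masks:
    DSC = 2 * |A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    truth = truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum() + eps)
```

In an evaluation like the five-fold cross-validation reported here, this score would be computed per organ for each test patient and then summarized as mean ± standard deviation.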
Published in: | Physics in Medicine & Biology, 2022-01, Vol. 67 (2), p. 25006 |
---|---|
Main Authors: | Dai, Xianjin; Lei, Yang; Wang, Tonghe; Zhou, Jun; Rudra, Soumon; McDonald, Mark; Curran, Walter J; Liu, Tian; Yang, Xiaofeng |
Format: | Article |
Language: | English |
Subjects: | deep learning; Head and Neck Neoplasms - diagnostic imaging; Head and Neck Neoplasms - radiotherapy; Humans; image segmentation; Magnetic Resonance Imaging; Neural Networks, Computer; Organs at Risk - diagnostic imaging; radiation therapy; Tomography, X-Ray Computed |
DOI: | 10.1088/1361-6560/ac3b34 |
ISSN: | 0031-9155 |
EISSN: | 1361-6560 |
PMID: | 34794138 |
Publisher: | IOP Publishing |