A joint convolutional-recurrent neural network with an attention mechanism for detecting intracranial hemorrhage on noncontrast head CT
To investigate the performance of a joint convolutional neural network–recurrent neural network (CNN-RNN) with an attention mechanism in identifying and classifying intracranial hemorrhage (ICH) on a large multi-center dataset, and to test its performance on a prospective independent sample of consecutive real-world patients. All consecutive patients who underwent emergency non-contrast-enhanced head CT at five different centers were retrospectively gathered. Five neuroradiologists created the ground-truth labels. The development dataset was divided into training and validation sets. After the development phase, we integrated the deep learning model into an independent center's PACS environment for over six months to assess its performance in a real clinical setting. Three radiologists created the ground-truth labels of the testing set by majority voting. A total of 55,179 head CT scans of 48,070 patients, 28,253 men (58.77%), with a mean age of 53.84 ± 17.64 years (range 18–89), were enrolled in the study. The validation sample comprised 5211 head CT scans, 991 of which were annotated as ICH-positive. The model's binary accuracy, sensitivity, and specificity on the validation set were 99.41%, 99.70%, and 98.91%, respectively. During the prospective implementation, the model yielded an accuracy of 96.02% on 452 head CT scans, with an average prediction time of 45 ± 8 s. The joint CNN-RNN model with an attention mechanism yielded excellent diagnostic accuracy in assessing ICH and its subtypes on a large-scale sample, and was seamlessly integrated into the radiology workflow. Though its performance decreased slightly, it provided decisions on the sample of consecutive real-world patients within a minute.
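The binary accuracy, sensitivity, and specificity quoted in the abstract follow the standard confusion-matrix definitions. A minimal sketch of those definitions, using hypothetical counts chosen to match the validation set's class balance (5211 scans, 991 ICH-positive) — not the study's actual confusion matrix:

```python
def binary_metrics(tp, fp, tn, fn):
    """Accuracy, sensitivity, and specificity from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)   # fraction of ICH-positive scans detected
    specificity = tn / (tn + fp)   # fraction of ICH-negative scans cleared
    return accuracy, sensitivity, specificity

# Hypothetical counts: 991 positives and 4220 negatives, as in the
# validation sample; the error split (fp=46, fn=3) is illustrative only.
acc, sens, spec = binary_metrics(tp=988, fp=46, tn=4174, fn=3)
print(f"accuracy={acc:.4f} sensitivity={sens:.4f} specificity={spec:.4f}")
```

With these assumed counts, sensitivity (988/991 ≈ 99.70%) and specificity (4174/4220 ≈ 98.91%) land near the reported figures; the true per-cell counts are not given in the record.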
Published in: | Scientific reports 2022-02, Vol.12 (1), p.2084-2084, Article 2084 |
---|---|
Main Authors: | Alis, Deniz; Alis, Ceren; Yergin, Mert; Topel, Cagdas; Asmakutlu, Ozan; Bagcilar, Omer; Senli, Yeseren Deniz; Ustundag, Ahmet; Salt, Vefa; Dogan, Sebahat Nacar; Velioglu, Murat; Selcuk, Hakan Hatem; Kara, Batuhan; Ozer, Caner; Oksuz, Ilkay; Kizilkilic, Osman; Karaarslan, Ercan |
Format: | Article |
Language: | English |
Subjects: | Accuracy; Deep Learning; Emergency Service, Hospital; Hemorrhage; Intracranial Hemorrhage, Traumatic - diagnostic imaging; Medical imaging; Neural networks; Radiology; Retrospective Studies; Tomography, X-Ray Computed |
creator | Alis, Deniz Alis, Ceren Yergin, Mert Topel, Cagdas Asmakutlu, Ozan Bagcilar, Omer Senli, Yeseren Deniz Ustundag, Ahmet Salt, Vefa Dogan, Sebahat Nacar Velioglu, Murat Selcuk, Hakan Hatem Kara, Batuhan Ozer, Caner Oksuz, Ilkay Kizilkilic, Osman Karaarslan, Ercan |
description | To investigate the performance of a joint convolutional neural network–recurrent neural network (CNN-RNN) with an attention mechanism in identifying and classifying intracranial hemorrhage (ICH) on a large multi-center dataset, and to test its performance on a prospective independent sample of consecutive real-world patients. All consecutive patients who underwent emergency non-contrast-enhanced head CT at five different centers were retrospectively gathered. Five neuroradiologists created the ground-truth labels. The development dataset was divided into training and validation sets. After the development phase, we integrated the deep learning model into an independent center's PACS environment for over six months to assess its performance in a real clinical setting. Three radiologists created the ground-truth labels of the testing set by majority voting. A total of 55,179 head CT scans of 48,070 patients, 28,253 men (58.77%), with a mean age of 53.84 ± 17.64 years (range 18–89), were enrolled in the study. The validation sample comprised 5211 head CT scans, 991 of which were annotated as ICH-positive. The model's binary accuracy, sensitivity, and specificity on the validation set were 99.41%, 99.70%, and 98.91%, respectively. During the prospective implementation, the model yielded an accuracy of 96.02% on 452 head CT scans, with an average prediction time of 45 ± 8 s. The joint CNN-RNN model with an attention mechanism yielded excellent diagnostic accuracy in assessing ICH and its subtypes on a large-scale sample, and was seamlessly integrated into the radiology workflow. Though its performance decreased slightly, it provided decisions on the sample of consecutive real-world patients within a minute. |
doi_str_mv | 10.1038/s41598-022-05872-x |
fullrecord | Scientific reports, 2022-02-08, Vol. 12 (1), Article 2084. Publisher: Nature Publishing Group UK (London). PMID: 35136123. DOI: 10.1038/s41598-022-05872-x. EISSN: 2045-2322. Open access. |
identifier | ISSN: 2045-2322 |
source | Publicly Available Content Database; PubMed Central; Free Full-Text Journals in Chemistry; Springer Nature - nature.com Journals - Fully Open Access |
subjects | 639/705/117 692/617/375/1370/534 Accuracy Adolescent Adult Aged Aged, 80 and over Deep Learning Emergency Service, Hospital Female Head Hemorrhage Humanities and Social Sciences Humans Intracranial Hemorrhage, Traumatic - diagnostic imaging Male Medical imaging Middle Aged multidisciplinary Neural networks Patients Performance assessment Prospective Studies Radiology Retrospective Studies Science Science (multidisciplinary) Tomography, X-Ray Computed Young Adult |