Comparison of a machine and deep learning model for automated tumor annotation on digitized whole slide prostate cancer histology

One in eight men will be affected by prostate cancer (PCa) in their lives. While the current clinical standard prognostic marker for PCa is the Gleason score, it is subject to inter-reviewer variability. This study compares two machine learning methods for discriminating between cancerous regions on digitized histology from 47 PCa patients. Whole-slide images were annotated by a GU fellowship-trained pathologist for each Gleason pattern. High-resolution tiles were extracted from annotated and unlabeled tissue. Patients were separated into a training set of 31 patients (Cohort A, n = 9345 tiles) and a testing cohort of 16 patients (Cohort B, n = 4375 tiles). Tiles from Cohort A were used to train a ResNet model, and glands from these tiles were segmented to calculate pathomic features to train a bagged ensemble model to discriminate tumors as (1) cancer and noncancer, (2) high- and low-grade cancer from noncancer, and (3) all Gleason patterns. The outputs of these models were compared to ground-truth pathologist annotations. The ensemble and ResNet models had overall accuracies of 89% and 88%, respectively, at predicting cancer from noncancer. The ResNet model was additionally able to differentiate Gleason patterns on data from Cohort B while the ensemble model was not. Our results suggest that quantitative pathomic features calculated from PCa histology can distinguish regions of cancer; however, texture features captured by deep learning frameworks better differentiate unique Gleason patterns.
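The abstract describes two methodological ingredients worth noting: the cohorts are split by patient (31 train, 16 test), not by tile, so no patient's tissue leaks between training and testing; and the classical model is a bagged ensemble over gland-derived "pathomic" feature vectors. A rough illustrative sketch — not the authors' code; all data, feature counts, and tile counts below are synthetic stand-ins — of a patient-wise split feeding a miniature bagged ensemble of decision stumps:

```python
# Illustrative only: synthetic tiles, a patient-level cohort split, and a
# hand-rolled bagged ensemble (bootstrap resamples + majority vote).
import random
from collections import Counter

random.seed(0)

# Synthetic stand-in: 47 patients, each contributing 20 tiles described by a
# small feature vector and a binary label (1 = cancer, 0 = noncancer).
def make_tile(patient_id):
    label = random.randint(0, 1)
    # one informative feature plus one noise feature
    features = [label + random.gauss(0, 0.5), random.gauss(0, 1)]
    return {"patient": patient_id, "x": features, "y": label}

tiles = [make_tile(p) for p in range(47) for _ in range(20)]

# Split by PATIENT (31 train / 16 test), never by tile, so no patient's
# tissue appears in both cohorts -- mirroring Cohort A vs Cohort B.
train = [t for t in tiles if t["patient"] < 31]
test = [t for t in tiles if t["patient"] >= 31]

def fit_stump(sample):
    # choose the feature/threshold pair with the fewest errors on the sample,
    # scanning a random subset of observed values as candidate thresholds
    best = None
    for f in range(len(sample[0]["x"])):
        values = [tile["x"][f] for tile in sample]
        for thr in random.sample(values, 20):
            errs = sum((tile["x"][f] > thr) != tile["y"] for tile in sample)
            if best is None or errs < best[0]:
                best = (errs, f, thr)
    _, f, thr = best
    return lambda x: int(x[f] > thr)

# Bagging: fit each stump on a bootstrap resample of the training tiles,
# then predict by majority vote across the ensemble.
stumps = [fit_stump(random.choices(train, k=len(train))) for _ in range(25)]

def predict(x):
    votes = Counter(s(x) for s in stumps)
    return votes.most_common(1)[0][0]

accuracy = sum(predict(t["x"]) == t["y"] for t in test) / len(test)
```

The patient-wise split is the part that transfers directly to real whole-slide pipelines: splitting at the tile level would let near-duplicate tissue from one patient appear in both cohorts and inflate test accuracy.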

Bibliographic Details
Published in: PloS one 2023-03, Vol.18 (3), p.e0278084-e0278084
Main Authors: Duenweg, Savannah R, Brehler, Michael, Bobholz, Samuel A, Lowman, Allison K, Winiarz, Aleksandra, Kyereme, Fitzgerald, Nencka, Andrew, Iczkowski, Kenneth A, LaViolette, Peter S
Format: Article
Language:English
description One in eight men will be affected by prostate cancer (PCa) in their lives. While the current clinical standard prognostic marker for PCa is the Gleason score, it is subject to inter-reviewer variability. This study compares two machine learning methods for discriminating between cancerous regions on digitized histology from 47 PCa patients. Whole-slide images were annotated by a GU fellowship-trained pathologist for each Gleason pattern. High-resolution tiles were extracted from annotated and unlabeled tissue. Patients were separated into a training set of 31 patients (Cohort A, n = 9345 tiles) and a testing cohort of 16 patients (Cohort B, n = 4375 tiles). Tiles from Cohort A were used to train a ResNet model, and glands from these tiles were segmented to calculate pathomic features to train a bagged ensemble model to discriminate tumors as (1) cancer and noncancer, (2) high- and low-grade cancer from noncancer, and (3) all Gleason patterns. The outputs of these models were compared to ground-truth pathologist annotations. The ensemble and ResNet models had overall accuracies of 89% and 88%, respectively, at predicting cancer from noncancer. The ResNet model was additionally able to differentiate Gleason patterns on data from Cohort B while the ensemble model was not. Our results suggest that quantitative pathomic features calculated from PCa histology can distinguish regions of cancer; however, texture features captured by deep learning frameworks better differentiate unique Gleason patterns.
doi_str_mv 10.1371/journal.pone.0278084
format article
contributor Fraiwan, Mohammad Amin
publisher United States: Public Library of Science
date 2023-03-16
pmid 36928230
orcidid https://orcid.org/0000-0003-4010-7737
orcidid https://orcid.org/0000-0002-9602-6891
rights Copyright: © 2023 Duenweg et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
fulltext fulltext
identifier ISSN: 1932-6203
ispartof PloS one, 2023-03, Vol.18 (3), p.e0278084-e0278084
issn 1932-6203
1932-6203
language eng
recordid cdi_plos_journals_2787569104
source Access via ProQuest (Open Access); PubMed Central
subjects Accuracy
Algorithms
Annotations
Artificial intelligence
Automation
Biology and Life Sciences
Cancer therapies
Comparative analysis
Computer and Information Sciences
Computer-aided medical diagnosis
Datasets
Deep Learning
Diagnosis
Digitization
Histology
Humans
Learning algorithms
Machine Learning
Male
Medical diagnosis
Medical prognosis
Medical research
Medical screening
Medicine and Health Sciences
Methods
Microscopy, Medical
Modelling
Neoplasm Grading
Neural networks
Pathology
Patients
Prognosis
Prostate - pathology
Prostate cancer
Prostatic Neoplasms - pathology
Regression analysis
Technology application
Tiles
Tumors
title Comparison of a machine and deep learning model for automated tumor annotation on digitized whole slide prostate cancer histology
url http://sfxeu10.hosted.exlibrisgroup.com/loughborough?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-01T07%3A42%3A25IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-gale_plos_&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Comparison%20of%20a%20machine%20and%20deep%20learning%20model%20for%20automated%20tumor%20annotation%20on%20digitized%20whole%20slide%20prostate%20cancer%20histology&rft.jtitle=PloS%20one&rft.au=Duenweg,%20Savannah%20R&rft.date=2023-03-16&rft.volume=18&rft.issue=3&rft.spage=e0278084&rft.epage=e0278084&rft.pages=e0278084-e0278084&rft.issn=1932-6203&rft.eissn=1932-6203&rft_id=info:doi/10.1371/journal.pone.0278084&rft_dat=%3Cgale_plos_%3EA741535714%3C/gale_plos_%3E%3Cgrp_id%3Ecdi_FETCH-LOGICAL-c693t-380b2643947627fe777344a723dd8914e2e061af81a7b8c3857f6dde6a350023%3C/grp_id%3E%3Coa%3E%3C/oa%3E%3Curl%3E%3C/url%3E&rft_id=info:oai/&rft_pqid=2787569104&rft_id=info:pmid/36928230&rft_galeid=A741535714&rfr_iscdi=true