Biased data, biased AI: deep networks predict the acquisition site of TCGA images
Published in: | Diagnostic pathology 2023-05, Vol.18 (1), p.67-67, Article 67 |
---|---|
Main Authors: | Dehkharghanian, Taher; Bidgoli, Azam Asilian; Riasatian, Abtin; Mazaheri, Pooria; Campbell, Clinton J V; Pantanowitz, Liron; Tizhoosh, H R; Rahnamayan, Shahryar |
Format: | Article |
Language: | English |
Subjects: | Accuracy; AI bias; AI ethics; Artificial neural networks; Bias; Cancer; Classification; Codes; Coloring Agents; Datasets; Deep Learning; Digital imaging; Digital pathology; Eosine Yellowish-(YS); Genomes; Health care facilities; Hematoxylin; Histopathology; Humans; Image acquisition; Image classification; Machine learning; Medical imaging; Medical prognosis; Medical research; Medicine, Experimental; Metadata; Morphology; Neoplasms - genetics; Neural networks; Neural Networks, Computer; Paraffin; Pathology; TCGA; Training |
Online Access: | https://doi.org/10.1186/s13000-023-01355-3 |
ISSN: | 1746-1596 |
Abstract:

Deep learning models applied to healthcare applications, including digital pathology, have grown in scope and importance in recent years. Many of these models have been trained on whole slide images (WSIs) from The Cancer Genome Atlas (TCGA) or use it as a validation source. One crucial factor that seems to have been widely ignored is the internal bias that originates from the institutions that contributed WSIs to the TCGA dataset, and its effect on models trained on this dataset.
8,579 paraffin-embedded, hematoxylin and eosin (H&E) stained digital slides were selected from the TCGA dataset; more than 140 medical institutions (acquisition sites) contributed to it. Two deep neural networks (DenseNet121 and KimiaNet) were used to extract deep features at 20× magnification. DenseNet121 was pre-trained on non-medical objects, while KimiaNet has the same architecture but was trained for cancer type classification on TCGA images. The extracted deep features were then used to detect each slide's acquisition site and to represent slides for image search.
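For readers less familiar with this step: feature extraction of this kind typically embeds fixed-size patches with a pretrained backbone and pools them into a slide-level descriptor. The sketch below is a minimal illustration under assumed details — the patch sampling, the pooling scheme, and the use of torchvision's ImageNet-pretrained DenseNet121 stand in for the paper's exact pipeline, and KimiaNet's weights are not bundled here.

```python
# Minimal sketch of deep-feature extraction with an ImageNet-pretrained
# DenseNet121 (assumed setup, not the authors' code). Patches cropped at
# 20x magnification are embedded and averaged into one 1024-d slide vector.
import torch
import torch.nn.functional as F
from torchvision import models, transforms

weights = models.DenseNet121_Weights.IMAGENET1K_V1
densenet = models.densenet121(weights=weights).eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),  # HxWx3 uint8 -> 3xHxW float in [0, 1]
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

@torch.no_grad()
def slide_descriptor(patches):
    """patches: iterable of HxWx3 uint8 arrays cropped from one slide."""
    batch = torch.stack([preprocess(p) for p in patches])
    fmaps = densenet.features(batch)                    # (N, 1024, h, w)
    feats = F.adaptive_avg_pool2d(fmaps, 1).flatten(1)  # (N, 1024) patch features
    return feats.mean(dim=0)                            # pool into one slide vector
```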
DenseNet121's deep features could distinguish acquisition sites with 70% accuracy, whereas KimiaNet's deep features revealed acquisition sites with more than 86% accuracy. These findings suggest that there are acquisition-site-specific patterns that deep neural networks can pick up. These medically irrelevant patterns were also shown to interfere with other applications of deep learning in digital pathology, namely image search. The study thus shows that acquisition-site-specific patterns can identify tissue acquisition sites without any explicit training for that task, and that a model trained for cancer subtype classification exploited such patterns to classify cancer types. Digital scanner configuration and noise, tissue stain variation and artifacts, and source-site patient demographics are among the factors that likely account for the observed bias. Researchers should therefore be cautious of such bias when using histopathology datasets to develop and train deep networks.
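To make the style of evaluation concrete, this kind of result can be reproduced in miniature with a linear probe that predicts the acquisition site from slide descriptors, plus a nearest-neighbor check for site leakage in retrieval. Everything below is a hedged toy sketch: the synthetic data, the probe choice (scikit-learn logistic regression), and the retrieval metric are assumptions, not the paper's protocol.

```python
# Toy sketch of quantifying site bias in deep features (assumed setup, not the
# paper's exact protocol). Synthetic descriptors stand in for real slide
# features; a per-site offset mimics acquisition-site-specific signal.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(0)
n_slides, n_sites, dim = 800, 20, 1024
sites = rng.integers(0, n_sites, n_slides)
site_offsets = rng.normal(size=(n_sites, dim))       # hypothetical site "fingerprints"
X = rng.normal(size=(n_slides, dim)) + 0.5 * site_offsets[sites]

X_tr, X_te, y_tr, y_te = train_test_split(
    X, sites, test_size=0.2, stratify=sites, random_state=0)

# Linear probe: if site identity is recoverable from the features,
# accuracy will sit far above the ~5% chance level for 20 sites.
probe = LogisticRegression(max_iter=2000).fit(X_tr, y_tr)
print("site-prediction accuracy:", accuracy_score(y_te, probe.predict(X_te)))

# Retrieval check: biased features pull same-site slides to the top of
# image-search results, regardless of medical content.
nn = NearestNeighbors(n_neighbors=5, metric="cosine").fit(X_tr)
_, idx = nn.kneighbors(X_te)
same_site = (y_tr[idx] == y_te[:, None]).mean()
print("fraction of retrieved neighbors from the query's own site:", same_site)
```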