
Whole slide cervical cancer classification via graph attention networks and contrastive learning


Bibliographic Details
Published in: Neurocomputing (Amsterdam), 2025-01, Vol. 613, p. 128787, Article 128787
Main Authors: Fei, Manman, Zhang, Xin, Chen, Dongdong, Song, Zhiyun, Wang, Qian, Zhang, Lichi
Format: Article
Language:English
Description: Cervical cancer is one of the most common cancers among women, which seriously threatens women’s health. Early screening can reduce the incidence rate and mortality. Thinprep cytologic test (TCT) is one of the important means of cytological screening, which has high sensitivity and specificity, and has been widely used in the early screening of cervical cancer. The automatic diagnosis of whole slide images (WSIs) by computers can effectively improve the efficiency and accuracy of doctors’ diagnoses.

However, current methods ignore the intrinsic relationships between cervical cells in WSIs and neglect contextual information from the surrounding suspicious areas, and therefore limit their robustness and generalizability. In this paper, we propose a novel two-stage method to implement the automatic diagnosis of WSIs, which constructs Graph Attention Networks (GAT) based on local and global fields respectively to capture their contextual information in a hierarchical manner.

In the first stage, we extract representative patches from each WSI through suspicious cell detection, and then employ a Local GAT to classify cervical cells by capturing correlations between suspicious cells in image tiles. This classification process provides the confidence and feature vectors for each suspicious cell. In the second stage, we perform WSI classification using a Global GAT model. We construct graphs corresponding to top-Kg and bottom-Kg cells for each WSI based on results from Local GAT, and introduce a supervised contrastive learning strategy to enhance the discriminative power of the extracted features. Experimental results demonstrate that our proposed method outperforms conventional approaches and effectively showcases the benefits of supervised contrastive learning.

Our source code and example data are available at https://github.com/feimanman/Whole-Slide-Cervical-Cancer-Classification.
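The supervised contrastive learning strategy mentioned in the abstract can be illustrated with a minimal NumPy sketch of the standard SupCon formulation, where all same-label samples in a batch serve as positives for an anchor. This is a generic illustration under that assumption, not the authors' released implementation; the function name and signature are hypothetical.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.07):
    """Supervised contrastive (SupCon) loss over a batch of embeddings.

    features: (N, D) array of embeddings; rows are L2-normalized here.
    labels:   (N,) integer class labels; same-label samples are positives.
    Returns the mean loss over anchors that have at least one positive.
    """
    feats = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = feats @ feats.T / temperature                 # pairwise cosine similarities
    n = feats.shape[0]
    self_mask = np.eye(n, dtype=bool)
    pos_mask = (labels[:, None] == labels[None, :]) & ~self_mask
    sim = np.where(self_mask, -np.inf, sim)             # exclude self-pairs
    # Row-wise log-softmax, computed stably (exp(-inf) contributes 0).
    m = sim.max(axis=1, keepdims=True)
    log_prob = sim - (m + np.log(np.exp(sim - m).sum(axis=1, keepdims=True)))
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0                              # anchors with positives
    per_anchor = -np.where(pos_mask, log_prob, 0.0).sum(axis=1)[valid]
    return float((per_anchor / pos_counts[valid]).mean())
```

As the abstract suggests, pulling same-class WSI-level features together and pushing different-class features apart lowers this loss, so well-separated embeddings score lower than mixed ones.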
DOI: 10.1016/j.neucom.2024.128787
ISSN: 0925-2312
Source: ScienceDirect Freedom Collection
Subjects: Cervical cancer; Graph attention network; Supervised contrastive learning; Whole slide image classification