
PolyCL: Context-Aware Contrastive Learning for Image Segmentation

Bibliographic Details
Main Authors: Moseley, Aaron, Imran, Abdullah Al Zubaer
Format: Conference Proceeding
Language: English
Description
Summary: Medical image segmentation is one of the most important tasks in an imaging pipeline, as it influences a number of image-guided decisions. To be effective, the standard fully-supervised segmentation approach requires a large amount of manually annotated training data. The expensive, time-consuming, and error-prone pixel-level annotation process hinders progress and makes it challenging to perform effective segmentations. It is, therefore, imperative that models learn as efficiently as possible from the limited available data. Such limited-label image segmentation can be facilitated by self-supervised learning (SSL), particularly contrastive learning via pre-training on unlabeled data and fine-tuning on limited annotations. To this end, we propose a novel self-supervised contrastive learning framework for medical image segmentation that leverages the inherent relationships between different images, dubbed PolyCL. Without requiring any pixel-level annotations or data augmentations, PolyCL learns and transfers context-aware discriminative features useful for segmentation from an innovative surrogate, in a task-related manner. Experimental evaluations on the public LiTS dataset demonstrate significantly superior performance of PolyCL over multiple baselines in segmenting the liver from abdominal computed tomography (CT) images, achieving a Dice improvement of up to 5.5%.
ISSN: 1945-8452
DOI: 10.1109/ISBI56570.2024.10635698
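
Note on the evaluation metric: the Dice improvement cited in the summary refers to the Dice similarity coefficient, the standard overlap measure between a predicted segmentation mask and the ground-truth mask. Below is a minimal NumPy sketch of that metric for context only; it is not code from the paper, and the function name and toy masks are illustrative assumptions.

    import numpy as np

    def dice_coefficient(pred, target, eps=1e-7):
        # Dice = 2|A ∩ B| / (|A| + |B|); eps avoids division by zero for empty masks.
        pred = pred.astype(bool)
        target = target.astype(bool)
        intersection = np.logical_and(pred, target).sum()
        return (2.0 * intersection + eps) / (pred.sum() + target.sum() + eps)

    # Toy example: two overlapping binary masks (hypothetical, not LiTS data).
    a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True   # 4 foreground pixels
    b = np.zeros((4, 4), dtype=bool); b[1:3, 1:4] = True   # 6 foreground pixels
    print(f"Dice = {dice_coefficient(a, b):.3f}")           # 2*4 / (4+6) = 0.800

A Dice score of 1.0 indicates perfect overlap and 0.0 indicates no overlap, so a 5.5% improvement is reported on this 0-to-1 scale.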