A Geometrically-Constrained Deep Network for CT Image Segmentation

Bibliographic Details
Main Authors: Lambert, Zoe, Le Guyader, Carole, Petitjean, Caroline
Format: Conference Proceeding
Language: English
Subjects:
Description
Summary: Incorporating prior knowledge into a segmentation process, whether it be geometrical constraints such as volume penalisation, (partial) convexity enforcement, or topological prescriptions to preserve the contextual relations between objects, proves to improve accuracy in medical image segmentation, in particular when addressing the issue of weak boundary definition. Motivated by this observation, the proposed contribution aims to include geometrical constraints in the training of convolutional neural networks in the form of a penalty in the loss function. These geometrical constraints take several forms and encompass level curve alignment through the weighted total variation component, an area penalisation phrased as a hard constraint in the modelling, and an intensity homogeneity criterion based on a combination of the standard Dice loss with the piecewise constant Mumford-Shah model. The mathematical formulation yields a non-smooth non-convex optimisation problem, which rules out conventional smooth optimisation techniques and leads us to adopt a Lagrangian setting. The application falls within the scope of organ-at-risk segmentation in CT (Computed Tomography) images, in the context of radiotherapy planning. Experiments demonstrate that our method provides significant improvements over existing non-constrained approaches.
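To make the loss components named in the summary concrete, here is a minimal NumPy sketch of a Dice term, an edge-weighted total variation term (encouraging level-curve alignment with image edges), and a piecewise-constant Mumford-Shah data term. This is an illustrative toy on soft 2D segmentations, not the authors' implementation: the weight function `exp(-beta*|∇I|)`, the hyperparameters `lam_tv`/`lam_ms`, and the finite-difference discretisation are all assumptions made here for the sketch; the paper additionally treats the area constraint as a hard constraint in a Lagrangian setting, which is omitted.

```python
import numpy as np

def dice_loss(p, g, eps=1e-6):
    # Soft Dice loss between predicted probabilities p and ground truth g.
    inter = (p * g).sum()
    return 1.0 - (2.0 * inter + eps) / (p.sum() + g.sum() + eps)

def weighted_tv(p, img, beta=10.0):
    # Edge-weighted total variation of p: the weights exp(-beta*|grad img|)
    # (an assumed edge-stopping function) vanish on strong image edges, so
    # the level curves of p are cheap to align with intensity edges.
    wx = np.exp(-beta * np.abs(np.diff(img, axis=0)))
    wy = np.exp(-beta * np.abs(np.diff(img, axis=1)))
    return (wx * np.abs(np.diff(p, axis=0))).sum() \
         + (wy * np.abs(np.diff(p, axis=1))).sum()

def mumford_shah_pc(p, img, eps=1e-6):
    # Piecewise-constant Mumford-Shah data term: fit one mean intensity
    # inside (c1) and one outside (c2) the soft region p, and penalise
    # the intensity variance around each mean.
    c1 = (p * img).sum() / (p.sum() + eps)
    c2 = ((1 - p) * img).sum() / ((1 - p).sum() + eps)
    return (p * (img - c1) ** 2 + (1 - p) * (img - c2) ** 2).sum()

def total_loss(p, g, img, lam_tv=0.1, lam_ms=0.01):
    # Toy combined objective (soft-penalty form only; weights are assumed).
    return dice_loss(p, g) + lam_tv * weighted_tv(p, img) \
                           + lam_ms * mumford_shah_pc(p, img)
```

On a perfectly segmented two-region image, the Dice term is near zero, the Mumford-Shah residual is near zero, and the TV term is small because the boundary of `p` coincides with a strong image edge where the weight is nearly zero.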
ISSN:1945-8452
DOI:10.1109/ISBI48211.2021.9434088