
Deep learning-based bias transfer for overcoming laboratory differences of microscopic images

Bibliographic Details
Published in: arXiv.org, 2021-05
Main Authors: Thebille, Ann-Katrin; Dietrich, Esther; Martin, Klaus; Gernhold, Lukas; Lennartz, Maximilian; Kuppe, Christoph; Kramann, Rafael; Huber, Tobias B; Sauter, Guido; Puelles, Victor G; Zimmermann, Marina; Bonn, Stefan
Format: Article
Language: English
Description
Summary: The automated analysis of medical images is currently limited by technical and biological noise and bias. The same source tissue can be represented by vastly different images if the image acquisition or processing protocols vary. For an image analysis pipeline, it is crucial to compensate for such biases to avoid misinterpretations. Here, we evaluate, compare, and improve existing generative model architectures to overcome domain shifts for immunofluorescence (IF) and Hematoxylin and Eosin (H&E) stained microscopy images. To determine the performance of the generative models, the original and transformed images were segmented or classified by deep neural networks that were trained only on images of the target bias. In the scope of our analysis, U-Net cycleGANs trained with an additional identity loss and an MS-SSIM-based loss, and Fixed-Point GANs trained with an additional structure loss, led to the best results for the IF and H&E stained samples, respectively. Adapting the bias of the samples significantly improved the pixel-level segmentation for human kidney glomeruli and podocytes and improved the classification accuracy for human prostate biopsies by up to 14%.
ISSN: 2331-8422
DOI: 10.48550/arxiv.2105.11765
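
The abstract names the best-performing configuration for the IF images: a cycleGAN with U-Net generators trained with an additional identity loss and an MS-SSIM-based loss. The sketch below illustrates how such a combined generator objective can be assembled in PyTorch. It is not the authors' code: the names G_ab, G_ba, D_b and the lambda_* weights are hypothetical, the third-party pytorch_msssim package is assumed, and the paper's exact loss formulation and weighting may differ.

```python
# Minimal sketch (assumptions noted above) of a CycleGAN generator objective
# extended with identity and MS-SSIM terms, in the spirit of the abstract.
import torch
import torch.nn.functional as F
from pytorch_msssim import ms_ssim  # third-party: pip install pytorch-msssim

def generator_loss(G_ab, G_ba, D_b, real_a, real_b,
                   lambda_cyc=10.0, lambda_id=5.0, lambda_ssim=1.0):
    """Loss for the A->B generator; inputs scaled to [0, 1].

    MS-SSIM with the default 5 scales needs images of at least 161 px
    per side, which microscopy tiles usually satisfy.
    """
    fake_b = G_ab(real_a)   # translate source bias A to target bias B
    rec_a = G_ba(fake_b)    # cycle back from B to A

    # Adversarial term (least-squares GAN variant, one common choice).
    pred = D_b(fake_b)
    adv = F.mse_loss(pred, torch.ones_like(pred))

    # Cycle consistency: the round trip should reproduce the input image.
    cyc = F.l1_loss(rec_a, real_a)

    # Identity: a target-domain image fed to G_ab should pass unchanged.
    idt = F.l1_loss(G_ab(real_b), real_b)

    # MS-SSIM: penalise loss of multi-scale structure during translation,
    # so tissue morphology survives the bias transfer.
    ssim = 1.0 - ms_ssim(fake_b, real_a, data_range=1.0)

    return adv + lambda_cyc * cyc + lambda_id * idt + lambda_ssim * ssim
```

In the same spirit, the structure loss that the abstract mentions for the Fixed-Point GANs on H&E samples would take the place of the MS-SSIM term; the segmentation and classification networks used for evaluation are trained separately on target-bias images and are independent of this objective.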