Auto-CSC: A Transfer Learning Based Automatic Cell Segmentation and Count Framework
Published in: Cyborg and Bionic Systems, 2022-01, Vol. 2022, p. 9842349
Main Authors: , , , ,
Format: Article
Language: English
Summary: Cell segmentation and counting play a very important role in the medical field. The diagnosis of many diseases relies heavily on the kinds and numbers of cells in the blood. Convolutional neural networks achieve encouraging results on image segmentation, but this data-driven approach requires a large number of annotations, and manual annotation is time-consuming, expensive, and prone to human error. In this paper, we present a novel framework for segmenting and counting cells without large numbers of manually annotated cell images. Before training, we generate segmentation labels for single-kind cell images using traditional image-processing algorithms; these images and their generated labels form the training sets. Training sets composed of different kinds of cell images are presented to the segmentation model to update its parameters. Finally, the pretrained U-Net model is transferred to segment mixed cell images using a small dataset of manually labeled mixed-cell images. To better evaluate the effectiveness of the proposed method, we design and train a new automatic cell segmentation and counting framework. The test results and analyses show that the segmentation and counting performance of the framework trained by the proposed method matches that of a model trained on large amounts of annotated mixed-cell images.
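The abstract does not spell out which traditional algorithms generate the single-kind-cell labels, so the following is only a minimal sketch of that stage, assuming Otsu thresholding with morphological cleanup for the mask and connected-component analysis for the count; the function names and the `min_area` filter are illustrative, not taken from the paper.

```python
import cv2
import numpy as np

def generate_pseudo_label(image_path):
    """Generate a binary segmentation mask for a single-kind cell image
    using classical image processing, so no manual annotation is needed."""
    gray = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    # Otsu's method picks a global threshold separating cells from background.
    # Assumes cells are brighter than the background; use THRESH_BINARY_INV
    # for stains where cells appear darker.
    _, mask = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening removes small speckle noise from the mask.
    kernel = np.ones((3, 3), np.uint8)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel, iterations=2)
    return mask

def count_cells(mask, min_area=50):
    """Count cells as connected components above a minimum area,
    discarding tiny components that are likely debris."""
    num_labels, _, stats, _ = cv2.connectedComponentsWithStats(mask)
    # Label 0 is the background, so counting starts at 1.
    return sum(1 for i in range(1, num_labels)
               if stats[i, cv2.CC_STAT_AREA] >= min_area)
```

Masks produced this way on single-kind cell images would then be paired with the raw images to form the training sets the abstract describes.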
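The transfer step could look like the PyTorch sketch below: load the U-Net weights pretrained on the pseudo-labeled single-kind images, freeze the encoder, and fine-tune on the small manually labeled mixed-cell set. The segmentation_models_pytorch U-Net, the checkpoint filename, and the encoder-freezing strategy are all assumptions; the paper does not specify its implementation details.

```python
import torch
import torch.nn as nn
import segmentation_models_pytorch as smp

# Hypothetical checkpoint from the single-kind-cell pretraining stage.
model = smp.Unet(encoder_name="resnet34", in_channels=1, classes=1)
model.load_state_dict(torch.load("unet_pretrained_single_kind.pth"))

# Transfer step: freeze the encoder so the features learned from the
# abundant pseudo-labeled data are kept, and fine-tune only the decoder
# and segmentation head on the small mixed-cell dataset.
for p in model.encoder.parameters():
    p.requires_grad = False

optimizer = torch.optim.Adam(
    (p for p in model.parameters() if p.requires_grad), lr=1e-4)
criterion = nn.BCEWithLogitsLoss()  # model outputs per-pixel logits

def fine_tune(model, mixed_loader, epochs=10):
    """Fine-tune on (image, mask) batches of manually labeled mixed cells."""
    model.train()
    for _ in range(epochs):
        for images, masks in mixed_loader:
            optimizer.zero_grad()
            loss = criterion(model(images), masks)
            loss.backward()
            optimizer.step()
```

Freezing the encoder is one common choice for small fine-tuning sets; fine-tuning all layers at a lower learning rate would be a reasonable alternative.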
ISSN: 2692-7632, 2097-1087
DOI: 10.34133/2022/9842349