Deploying Patch-Based Segmentation Pipeline for Fibroblast Cell Images at Varying Magnifications
Published in: IEEE Access, 2023, Vol. 11, pp. 98171-98181
Main Authors: , , , , ,
Format: Article
Language: English
Summary: Cell culture monitoring requires careful, continuous characterization of the cultivated cells. Machine learning has recently been applied to tasks such as microscopy image segmentation; however, a model trained on one dataset may not be comprehensive enough for other datasets, and most algorithms do not cover a wide range of data attributes and require distinct system workflows. The main objective of this research is therefore to propose a segmentation pipeline for fibroblast cell images acquired with phase-contrast microscopy at different magnifications and to achieve reliable predictions during deployment. The research employs patch-based segmentation for prediction, with U-Net as the baseline architecture. The proposed pipeline performed well with the U-Net-based network, achieving an IoU score above 0.7 across multiple magnifications and predicting the cell confluency value with less than 3% error. The study also found that the model could segment the fibroblast cells in under 10 seconds using OpenVINO and an Intel Compute Stick 2 on a Raspberry Pi, with optimal precision limited to approximately 80% cell confluency, which is sufficient for real-world deployment since cell cultures are typically ready for passaging at that threshold.
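The confluency figure reported in the summary is derived from the predicted segmentation mask. As a rough illustration only (not the authors' implementation), the sketch below shows how patch-based prediction and a confluency estimate could be wired together; the 256-pixel patch size, the `predict_mask_patchwise`/`segment_patch` names, and the intensity-threshold stand-in for the U-Net are all assumptions made for the example.

```python
import numpy as np

def predict_mask_patchwise(image, segment_patch, patch=256):
    """Tile a grayscale image into non-overlapping patches, run a
    segmentation callable on each patch, and stitch the binary masks
    back together. `segment_patch` maps a (patch, patch) array to a
    binary mask of the same shape (e.g. a U-Net wrapped for inference);
    the image is zero-padded up to a multiple of the patch size."""
    h, w = image.shape
    ph = int(np.ceil(h / patch)) * patch
    pw = int(np.ceil(w / patch)) * patch
    padded = np.zeros((ph, pw), dtype=image.dtype)
    padded[:h, :w] = image
    mask = np.zeros((ph, pw), dtype=np.uint8)
    for y in range(0, ph, patch):
        for x in range(0, pw, patch):
            mask[y:y + patch, x:x + patch] = segment_patch(
                padded[y:y + patch, x:x + patch])
    return mask[:h, :w]  # crop the padding back off

def confluency(mask):
    """Cell confluency as the percentage of pixels predicted as cell."""
    return 100.0 * np.count_nonzero(mask) / mask.size

if __name__ == "__main__":
    # Stand-in for the trained U-Net: a simple intensity threshold,
    # used here only so the sketch runs end to end.
    dummy_unet = lambda p: (p > 0.5).astype(np.uint8)
    img = np.random.rand(600, 800).astype(np.float32)
    m = predict_mask_patchwise(img, dummy_unet, patch=256)
    print(f"estimated confluency: {confluency(m):.1f}%")
```

On the hardware described in the summary, the `segment_patch` callable would presumably wrap an OpenVINO-compiled network (for example, a model compiled for the MYRIAD device used by Intel's compute-stick accelerators), with the same tiling and area computation applied to its output.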
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3312232