Cell Phenotype Classification Based on Joint of Texture Information and Multilayer Feature Extraction in DenseNet

Bibliographic Details
Published in: Computational Intelligence and Neuroscience, 2022-11, Vol. 2022, p. 6895833-12
Main Authors: Fekri-Ershad, Shervan, Al-Imari, Mustafa Jawad, Hamad, Mohammed Hayder, Alsaffar, Marwa Fadhil, Hassan, Fuad Ghazi, Hadi, Mazin Eidan, Mahdi, Karrar Salih
Format: Article
Language:English
Description
Summary: Cell phenotype classification is a critical task in many medical applications, such as protein localization, gene effect identification, and the diagnosis of some cancer types. Fluorescence imaging is the most efficient tool for analyzing the biological characteristics of cells, so cell phenotype classification in fluorescence microscopy images has received increasing attention from scientists over the last decade. The visible structures of cells usually differ in shape, texture, intensity relationships, etc. In this scope, most existing approaches use either a single type of feature or a combination of low-level and high-level features. In this paper, a new approach is proposed based on a combination of low-level and high-level features. An improved version of local quinary patterns is used to extract low-level texture features, and an innovative multilayer deep feature extraction method extracts high-level features from DenseNet: the output feature map of each dense block is passed separately through pooling and flatten layers, and the resulting feature vectors are concatenated. The performance of the proposed approach is evaluated on the benchmark 2D-HeLa dataset in terms of accuracy and compared with state-of-the-art methods. The comparison demonstrates higher performance of the proposed approach relative to several efficient methods.
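The low-level descriptor named in the abstract, the local quinary pattern, can be sketched as follows. This is an illustrative implementation of a standard quinary pattern, not the paper's improved variant: the thresholds `t1` and `t2`, the 8-neighbour layout, and the four-way binary split are assumptions based on how quinary patterns are commonly defined (each centre-neighbour difference is mapped to one of five levels {-2, -1, 0, 1, 2}, and the quinary code is decomposed into four LBP-style binary codes whose histograms are concatenated).

```python
import numpy as np

def local_quinary_pattern(img, t1=2, t2=5):
    """Illustrative local quinary pattern descriptor.

    t1 < t2 are hypothetical thresholds; the improved variant in the
    paper may choose them differently (e.g. adaptively).
    """
    # 8-neighbour offsets, clockwise from the top-left pixel
    offs = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
            (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    c = img[1:-1, 1:-1].astype(int)          # centre pixels (borders skipped)
    levels = []
    for dy, dx in offs:
        # difference between each neighbour and the centre pixel
        d = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx].astype(int) - c
        q = np.zeros_like(d)
        q[d >= t2] = 2                        # strongly brighter
        q[(d >= t1) & (d < t2)] = 1           # slightly brighter
        q[(d <= -t1) & (d > -t2)] = -1        # slightly darker
        q[d <= -t2] = -2                      # strongly darker
        levels.append(q)
    # Split the quinary code into four binary patterns, one per nonzero
    # level, and build a 256-bin LBP-style histogram for each.
    hists = []
    for lvl in (2, 1, -1, -2):
        code = np.zeros_like(c)
        for bit, q in enumerate(levels):
            code += (q == lvl).astype(int) << bit
        hists.append(np.bincount(code.ravel(), minlength=256))
    return np.concatenate(hists)              # 4 x 256 = 1024-bin descriptor

img = np.random.randint(0, 256, (32, 32))
feat = local_quinary_pattern(img)
print(feat.shape)  # (1024,)
```

Each of the four histograms counts all interior pixels once, so the descriptor sums to four times the number of interior pixels; in practice the histograms would be normalized before being concatenated with the DenseNet features.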
ISSN:1687-5265
1687-5273
DOI:10.1155/2022/6895833