
MSFF-UNet: Image segmentation in colorectal glands using an encoder-decoder U-shaped architecture with multi-scale feature fusion

Bibliographic Details
Published in: Multimedia Tools and Applications, 2024-04, Vol. 83 (14), p. 42681-42701
Main Authors: Liu, Chengdao, Peng, Kexin, Peng, Ziyang, Zhang, Xingzhi
Format: Article
Language:English
Description
Summary: Glands are closely related to the diagnosis of tumors, and in pathological images the segmentation of colorectal glands is a prerequisite for quantitative diagnosis. Deep-learning-based segmentation algorithms have been widely used in medical imaging. However, existing segmentation methods perform feature fusion only between adjacent layers, ignoring cross-layer fusion, and they neglect the combination of local and global information. To solve these problems, we propose a multi-scale fusion model (MSFF-UNet) based on U-Net. We enhance the fusion of multi-scale information in the feature fusion module (FFM) and combine it with spatial attention to highlight the spatial structure of objects. In addition, we use a receptive field extension module (RFEM) to fuse local and global information, reducing information loss and improving segmentation performance. We also propose a boundary loss function, which makes the network pay more attention to boundary information and yields more accurate segmentation results. Compared to the U-Net model, our network improves the Dice score by 1.95% and the mIoU score by 2.6%, effectively improving the accuracy of colorectal gland segmentation.
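
The abstract names three components (an FFM with spatial attention, an RFEM, and a boundary loss) but this record carries no implementation details. As a rough, non-authoritative illustration of cross-layer fusion combined with spatial attention, here is a minimal PyTorch sketch; every module name, shape, and hyperparameter below is an assumption, not the authors' code:

```python
# Hypothetical sketch of cross-layer feature fusion with spatial attention,
# loosely in the spirit of the FFM described in the abstract; NOT the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SpatialAttention(nn.Module):
    """CBAM-style spatial attention: re-weight each location with a sigmoid map."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        # Pool across channels to get two single-channel spatial descriptors.
        avg = x.mean(dim=1, keepdim=True)
        mx, _ = x.max(dim=1, keepdim=True)
        attn = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * attn

class FeatureFusionModule(nn.Module):
    """Fuse a shallow (high-resolution) and a deep (low-resolution) feature map."""
    def __init__(self, shallow_ch: int, deep_ch: int, out_ch: int):
        super().__init__()
        self.proj = nn.Conv2d(shallow_ch + deep_ch, out_ch, kernel_size=1)
        self.attn = SpatialAttention()

    def forward(self, shallow, deep):
        # Upsample the deep map to the shallow map's resolution (the two maps
        # need not come from adjacent stages, allowing cross-layer fusion),
        # concatenate, project, then re-weight spatially.
        deep_up = F.interpolate(deep, size=shallow.shape[2:],
                                mode="bilinear", align_corners=False)
        fused = self.proj(torch.cat([shallow, deep_up], dim=1))
        return self.attn(fused)
```

The RFEM is described only as fusing local and global information; stacked dilated convolutions are one common way to widen a module's receptive field, though the paper's exact design is not given in this record. The boundary loss is likewise described only at a high level. One common way to realize such a loss is to up-weight pixels near gland edges, extracted here with a morphological gradient; the paper's actual formulation may differ:

```python
# Hypothetical boundary-weighted BCE loss; an illustration of the general idea
# of emphasizing boundary pixels, not the paper's exact loss.
import torch
import torch.nn.functional as F

def boundary_weight_map(mask: torch.Tensor, width: int = 3) -> torch.Tensor:
    """Mark pixels within `width` of a mask edge via a morphological gradient.
    `mask` is a float tensor of 0s and 1s shaped (N, 1, H, W); `width` is odd."""
    pad = width // 2
    dilated = F.max_pool2d(mask, width, stride=1, padding=pad)
    eroded = -F.max_pool2d(-mask, width, stride=1, padding=pad)
    return (dilated - eroded).clamp(0.0, 1.0)

def boundary_bce_loss(logits, target, boundary_gain: float = 4.0):
    """BCE where boundary pixels receive (1 + boundary_gain) times the weight."""
    weights = 1.0 + boundary_gain * boundary_weight_map(target)
    return F.binary_cross_entropy_with_logits(logits, target, weight=weights)
```

For reference, the reported gains (+1.95% Dice, +2.6% mIoU over U-Net) refer to the standard overlap metrics, where mIoU averages the per-class IoU:

```python
# Standard Dice and IoU for binary masks (pred and target are 0/1 float tensors).
import torch

def dice_score(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7):
    inter = (pred * target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def iou_score(pred: torch.Tensor, target: torch.Tensor, eps: float = 1e-7):
    inter = (pred * target).sum()
    union = pred.sum() + target.sum() - inter
    return (inter + eps) / (union + eps)
```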
ISSN: 1380-7501
EISSN: 1573-7721
DOI: 10.1007/s11042-023-17079-x