
A cross-scale mixed attention network for smoke segmentation

Bibliographic Details
Published in: Digital Signal Processing, 2023-04, Vol. 134, p. 103924, Article 103924
Main Authors: Yuan, Feiniu; Shi, Yu; Zhang, Lin; Fang, Yuming
Format: Article
Language: English
Description
Summary: Deep neural networks have made good progress in smoke segmentation, but accurately segmenting smoke images remains challenging because smoke is semi-transparent and varies greatly in appearance. To improve performance, we propose a Cross-scale Mixed-attention Network (CMNet) built from multi-scale and mixed attention modules. We first concatenate the results of average and maximum pooling along each axis to learn powerful attention coefficients, which are used to weight the original input and produce a directional attention map along each axis. We then combine the three directional attention maps by point-wise addition to form a Fused 3D Attention (F3A) module. In addition, we adopt atrous convolutions to generate multi-scale feature maps, add the results of average and maximum pooling on each scale's feature map point-wise, and design a bottleneck structure that produces an effective attention map for each scale while reducing the number of learnable parameters. The attention feature maps of all scales are concatenated to obtain Multi-scale Channel Attention (MCA). Finally, we stack the F3A and MCA modules cross-wise on both shallow and deep feature maps to form Mixed Cross Enhancement (MCE), which fully fuses information across scales. Experiments show that our method surpasses most existing methods.
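
The abstract does not include implementation details, so the following is only a minimal PyTorch sketch of one plausible reading of the F3A and MCA modules. The class names, the 1x1-convolution attention heads, the reduction ratio, and the dilation rates (1, 2, 4) are assumptions for illustration, not the authors' code.

# Hedged sketch: one possible reading of the F3A and MCA modules described in the
# abstract. Layer choices (1x1 convs, reduction=4, dilations 1/2/4) are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AxisAttention(nn.Module):
    """Attention along one axis of a (B, C, H, W) tensor: average and max pooling
    along that axis are concatenated, a 1x1 convolution learns attention
    coefficients, and the coefficients reweight the input (broadcast along the
    pooled axis)."""

    def __init__(self, channels: int, dim: int):
        super().__init__()
        self.dim = dim  # 1 = channel, 2 = height, 3 = width
        in_ch = 2 if dim == 1 else 2 * channels
        out_ch = 1 if dim == 1 else channels
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        avg = x.mean(dim=self.dim, keepdim=True)
        mx = x.amax(dim=self.dim, keepdim=True)
        coeff = torch.sigmoid(self.conv(torch.cat([avg, mx], dim=1)))
        return x * coeff  # directional attention map for this axis


class F3A(nn.Module):
    """Fused 3D Attention: point-wise addition of the three directional attention maps."""

    def __init__(self, channels: int):
        super().__init__()
        self.axes = nn.ModuleList([AxisAttention(channels, d) for d in (1, 2, 3)])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return sum(axis(x) for axis in self.axes)


class ScaleChannelAttention(nn.Module):
    """Channel attention for one scale: global average and max pooling are added
    point-wise, then a bottleneck (reduce/expand 1x1 convs) yields per-channel weights."""

    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.bottleneck = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        ctx = F.adaptive_avg_pool2d(x, 1) + F.adaptive_max_pool2d(x, 1)
        return self.bottleneck(ctx)


class MCA(nn.Module):
    """Multi-scale Channel Attention: atrous convolutions at several dilation rates
    produce scale-specific features; each is reweighted by its own channel attention
    and the results are concatenated along the channel axis."""

    def __init__(self, in_channels: int, branch_channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(in_channels, branch_channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        ])
        self.attns = nn.ModuleList([ScaleChannelAttention(branch_channels) for _ in dilations])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        outs = []
        for conv, attn in zip(self.branches, self.attns):
            feat = conv(x)
            outs.append(feat * attn(feat))
        return torch.cat(outs, dim=1)


if __name__ == "__main__":
    x = torch.randn(2, 64, 32, 32)
    print(F3A(64)(x).shape)      # torch.Size([2, 64, 32, 32])
    print(MCA(64, 32)(x).shape)  # torch.Size([2, 96, 32, 32])

Under these assumptions, F3A preserves the input shape, while MCA outputs branch_channels x len(dilations) channels; the Mixed Cross Enhancement (MCE) arrangement of these modules across shallow and deep feature maps is not specified in the abstract and is not sketched here.
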
ISSN: 1051-2004, 1095-4333
DOI: 10.1016/j.dsp.2023.103924