Boundary-aware residual network for defect detection in strip steel products
| Published in | Evolving Systems, 2024-10, Vol. 15 (5), pp. 1649–1663 |
|---|---|
| Main Authors | |
| Format | Article |
| Language | English |
| Summary | Strip steel is an important industrial material used across many fields, and automatic detection of its surface defects can significantly improve production efficiency. However, defect detection remains an open challenge in manufacturing. Salient object detection (SOD) offers one route to this problem, but existing SOD methods are still less than ideal at localizing the boundaries of strip steel defects. To address these challenges, this study proposes a novel saliency detection method built on a boundary-aware residual network (BARNet), comprising a feature weighting module (FWM), a boundary-aware module (BAM), and a residual refinement module (RRM). First, multi-scale features are extracted by a fully convolutional network in the encoder stage. The FWM then produces preliminary saliency maps, the BAM extracts and recovers the boundary features of those maps, and the RRM generates the optimized saliency maps (an illustrative sketch of this pipeline follows the record below). To demonstrate the effectiveness of BARNet, comprehensive experiments were conducted on the SD-saliency-900 dataset. The results indicate that the model effectively extracts the shape, contour, and structural features of defects while locating them accurately. |
| ISSN | 1868-6478; 1868-6486 |
| DOI | 10.1007/s12530-024-09588-3 |
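
To make the pipeline described in the abstract concrete (encoder → FWM → BAM → RRM), here is a minimal PyTorch sketch. The record does not describe the internals of BARNet's modules, so every layer choice, each module body, and the `BARNetSketch` class below are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of the BARNet pipeline from the abstract.
# The paper's actual module designs are NOT given in this record;
# all layer choices below are assumptions for illustration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


class FeatureWeightingModule(nn.Module):
    """Hypothetical FWM: channel-attention weighting of encoder features,
    then a 1x1 projection to a single-channel preliminary saliency map."""
    def __init__(self, channels: int):
        super().__init__()
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.project = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor) -> torch.Tensor:
        weighted = feats * self.attn(feats)            # re-weight channels
        return torch.sigmoid(self.project(weighted))   # preliminary saliency map


class BoundaryAwareModule(nn.Module):
    """Hypothetical BAM: derive a boundary map from the preliminary
    saliency map and fuse it back in to recover defect boundaries."""
    def __init__(self):
        super().__init__()
        self.edge = nn.Conv2d(1, 1, kernel_size=3, padding=1)
        self.fuse = nn.Conv2d(2, 1, kernel_size=3, padding=1)

    def forward(self, sal: torch.Tensor) -> torch.Tensor:
        boundary = torch.sigmoid(self.edge(sal))       # boundary features
        return torch.sigmoid(self.fuse(torch.cat([sal, boundary], dim=1)))


class ResidualRefinementModule(nn.Module):
    """Hypothetical RRM: predict a residual correction and add it
    to the saliency map before the final activation."""
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(16, 1, kernel_size=3, padding=1),
        )

    def forward(self, sal: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(sal + self.body(sal))     # refined saliency map


class BARNetSketch(nn.Module):
    """Encoder -> FWM -> BAM -> RRM, mirroring the stages in the abstract.
    A tiny convolutional encoder stands in for the paper's FCN backbone."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(inplace=True),
        )
        self.fwm = FeatureWeightingModule(64)
        self.bam = BoundaryAwareModule()
        self.rrm = ResidualRefinementModule()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = self.encoder(x)    # encoder features (multi-scale simplified away)
        sal = self.fwm(feats)      # preliminary saliency map
        sal = self.bam(sal)        # boundary-aware recovery
        sal = self.rrm(sal)        # residual refinement
        # upsample the prediction back to the input resolution
        return F.interpolate(sal, size=x.shape[-2:], mode="bilinear", align_corners=False)


if __name__ == "__main__":
    model = BARNetSketch()
    dummy = torch.randn(1, 3, 224, 224)   # e.g. a strip steel image patch
    print(model(dummy).shape)             # -> torch.Size([1, 1, 224, 224])
```

Each stage maps one-to-one onto a sentence of the abstract: the toy encoder stands in for the fully convolutional backbone, and the three modules transform its features into a preliminary, boundary-recovered, and finally refined saliency map.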