Attention-Gate-based U-shaped Reconstruction Network (AGUR-Net) for color-patterned fabric defect detection
Published in: Textile Research Journal, 2023-08, Vol. 93 (15-16), p. 3459-3477
Main Authors: , , , ,
Format: Article
Language: English
Summary: Color-patterned fabrics possess changeable patterns, a low probability of defective samples, and defects of various forms. Therefore, unsupervised inspection of color-patterned fabrics has gradually become a research hotspot in the field of fabric defect detection. However, because of redundant information in the network's skip connections and the limitations of post-processing, current reconstruction-based unsupervised fabric defect detection methods struggle to detect some defects of color-patterned fabrics. In this article, we propose an Attention-Gate-based U-shaped Reconstruction Network (AGUR-Net) and a dual-threshold segmentation post-processing method. AGUR-Net consists of an encoder, an Atrous Spatial Pyramid Pooling (ASPP) module, and an attention-gate-weighted fusion residual decoder. The encoder, built on EfficientNet-B2, extracts representative features from the input image. The ASPP module enlarges the receptive field of the network and introduces multi-scale information into the decoder. The attention-gate-weighted fusion residual decoder fuses encoder features with decoder features to produce the reconstructed image. Dual-threshold segmentation post-processing then yields the final defect detection results. Our method achieves a precision of 59.38%, a recall of 59.1%, an F1 score of 54.31%, and an intersection-over-union of 41.18% on the public dataset YDFID-1. The experimental results show that, compared with several other state-of-the-art unsupervised fabric defect detection methods, the proposed method better detects and locates defects in color-patterned fabrics.
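The abstract names a dual-threshold segmentation post-processing step but does not spell out its mechanics here. A common realization of dual thresholding on a reconstruction-residual map is hysteresis-style segmentation: a high threshold seeds defect regions, and a low threshold grows those seeds through connected weaker pixels. The sketch below illustrates that idea only; the function name, threshold values, and toy residual map are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def dual_threshold_segment(residual, t_high, t_low):
    """Hysteresis-style dual-threshold segmentation of a residual map.

    Pixels >= t_high seed defect regions; pixels >= t_low are kept only
    if they connect (4-neighbourhood) to a seed. Returns a boolean mask.
    This is an illustrative sketch, not the paper's exact procedure.
    """
    strong = residual >= t_high
    weak = residual >= t_low
    mask = strong.copy()
    h, w = residual.shape
    # Flood fill outward from the strong seeds through weak pixels.
    stack = list(zip(*np.nonzero(strong)))
    while stack:
        r, c = stack.pop()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and weak[nr, nc] and not mask[nr, nc]:
                mask[nr, nc] = True
                stack.append((nr, nc))
    return mask

# Toy |input - reconstruction| map: one clear defect blob plus one
# faint isolated pixel that the low threshold alone would admit.
res = np.array([
    [0.1, 0.2, 0.1, 0.1],
    [0.1, 0.9, 0.6, 0.1],
    [0.1, 0.6, 0.5, 0.1],
    [0.1, 0.1, 0.1, 0.6],
])
mask = dual_threshold_segment(res, t_high=0.8, t_low=0.45)
# The 2x2 blob around the strong pixel survives; the lone weak
# pixel at (3, 3) is discarded because it touches no seed.
```

The benefit over a single threshold is visible in the toy example: a low threshold alone would flag the isolated noise pixel, while a high threshold alone would shrink the defect to its single strongest pixel.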
ISSN: 0040-5175, 1746-7748
DOI: 10.1177/00405175221149450