Automatic sewer pipe defect semantic segmentation based on improved U-Net
Published in: Automation in Construction, 2020-11, Vol. 119, p. 103383, Article 103383
Format: Article
Language: English
Summary: At present, technologies based on deep learning methods for the automated detection of sewer defects are developing rapidly. In this study, a novel semantic segmentation network called PipeUNet is proposed for sewer defect segmentation. To enhance feature extraction capability and resolve the semantic differences between high-level and low-level features, a new module named the feature reuse and attention mechanism (FRAM) block is added to the original skip connections of U-Net. Focal loss is adopted to address the class imbalance problem. PipeUNet was trained on CCTV images with typical defects including crack, infiltration, joint offset, and intruding lateral, and was tested on defect images and normal images to evaluate its defect segmentation and detection performance, respectively. It achieved the highest Mean Intersection over Union of 76.37%, demonstrating the efficiency of the proposed approach, and it can process CCTV images at a high speed of 32 images per second.
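
The abstract states that focal loss is adopted to counter the class imbalance between defect and background pixels, but this record does not give the hyper-parameters used. The following is a minimal sketch of a standard per-pixel focal loss for multi-class segmentation in PyTorch; the gamma and alpha values and the class count are assumptions for illustration, not the paper's settings.

```python
# Sketch of a standard multi-class focal loss for semantic segmentation.
# gamma and alpha are assumed defaults; the paper's values are not given here.
import torch
import torch.nn.functional as F

def focal_loss(logits: torch.Tensor, target: torch.Tensor,
               gamma: float = 2.0, alpha: float = 0.25) -> torch.Tensor:
    """logits: (N, C, H, W) raw scores; target: (N, H, W) integer class labels."""
    ce = F.cross_entropy(logits, target, reduction="none")  # per-pixel cross-entropy, (N, H, W)
    p_t = torch.exp(-ce)                                     # probability assigned to the true class
    loss = alpha * (1.0 - p_t) ** gamma * ce                 # down-weight easy, well-classified pixels
    return loss.mean()

# Hypothetical example: 4 defect classes plus background on a 256x256 CCTV frame
logits = torch.randn(2, 5, 256, 256)
target = torch.randint(0, 5, (2, 256, 256))
print(focal_loss(logits, target))
```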
Highlights:
• A novel semantic segmentation network based on an improved U-Net is proposed.
• A new module named the feature reuse and attention mechanism (FRAM) block is proposed.
• Focal loss is used to address the class imbalance problem.
• The model achieves good results on crack, infiltration, joint offset and intruding lateral defects.
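
The 76.37% figure reported above is a Mean Intersection over Union (mIoU) score. As a reference only, the sketch below shows the standard way per-class IoU is averaged into mIoU; the paper's exact evaluation protocol (e.g. whether background is included) is not described in this record, and the data here is hypothetical.

```python
# Sketch of Mean Intersection over Union (mIoU) over integer label maps.
import numpy as np

def mean_iou(pred: np.ndarray, gt: np.ndarray, num_classes: int) -> float:
    """pred, gt: integer label maps of the same shape."""
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:                     # skip classes absent from both maps
            ious.append(inter / union)
    return float(np.mean(ious))

# Hypothetical example with a 4-class label map
pred = np.random.randint(0, 4, (256, 256))
gt = np.random.randint(0, 4, (256, 256))
print(mean_iou(pred, gt, num_classes=4))
```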
ISSN: 0926-5805, 1872-7891
DOI: 10.1016/j.autcon.2020.103383