Automatic Pixel-Level Crack Detection on Dam Surface Using Deep Convolutional Network
Published in: Sensors (Basel, Switzerland), 2020-04, Vol. 20(7), p. 2069
Main Authors: , , , ,
Format: Article
Language: English
Summary: Crack detection on dam surfaces is an important task for the safety inspection of hydropower stations. More and more deep-learning-based object detection methods are being applied to crack detection. However, most of these methods can only achieve classification and rough localization of cracks. Pixel-level crack detection can provide more intuitive and accurate results for dam health assessment. To realize pixel-level crack detection, a method of crack detection on dam surface (CDDS) using a deep convolutional network is proposed. First, an unmanned aerial vehicle (UAV) is used to collect dam surface images along a predetermined trajectory. Second, the raw images are cropped. Then crack regions are manually labelled on the cropped images to create the crack dataset, and the architecture of the CDDS network is designed. Finally, the CDDS network is trained, validated and tested on the crack dataset. To validate the performance of the CDDS network, its predictions are compared with those of a ResNet152-based model, SegNet, UNet and a fully convolutional network (FCN). For crack segmentation, the recall, precision, F-measure and IoU are 80.45%, 80.31%, 79.16%, and 66.76%, respectively. The results on the test dataset show that the CDDS network has better performance for crack detection on dam surfaces.
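The summary reports pixel-level recall, precision, F-measure and IoU on the test set. As an illustration only (not the authors' code), the sketch below shows one common way such metrics are computed from a binary ground-truth mask and a binary predicted crack mask; the function name `crack_segmentation_metrics` and the use of NumPy are assumptions made for this example.

```python
# Illustrative sketch: pixel-level segmentation metrics of the kind reported
# in the summary (recall, precision, F-measure, IoU), computed from binary masks.
import numpy as np

def crack_segmentation_metrics(pred_mask: np.ndarray, gt_mask: np.ndarray, eps: float = 1e-8):
    """Return (recall, precision, F-measure, IoU) for binary crack masks.

    pred_mask, gt_mask: same-shape arrays of 0/1 (or bool), where 1 marks crack pixels.
    eps avoids division by zero when an image contains no crack pixels.
    """
    pred = pred_mask.astype(bool)
    gt = gt_mask.astype(bool)

    tp = np.logical_and(pred, gt).sum()    # crack pixels correctly detected
    fp = np.logical_and(pred, ~gt).sum()   # background pixels wrongly marked as crack
    fn = np.logical_and(~pred, gt).sum()   # crack pixels that were missed

    recall = tp / (tp + fn + eps)
    precision = tp / (tp + fp + eps)
    f_measure = 2 * precision * recall / (precision + recall + eps)
    iou = tp / (tp + fp + fn + eps)
    return recall, precision, f_measure, iou

# Example usage on a toy 4x4 patch (hypothetical data):
gt = np.array([[0, 1, 1, 0],
               [0, 1, 0, 0],
               [0, 1, 0, 0],
               [0, 0, 0, 0]])
pred = np.array([[0, 1, 0, 0],
                 [0, 1, 0, 0],
                 [0, 1, 1, 0],
                 [0, 0, 0, 0]])
print(crack_segmentation_metrics(pred, gt))
```

In this convention the metrics are computed over crack pixels only, so a model that predicts all background would score zero; whether the paper averages per image or over the whole test set is not stated in the summary.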
ISSN: 1424-8220
DOI: 10.3390/s20072069