Towards Pedestrian Target Detection with Optimized Mask R-CNN
Published in: Complexity (New York, N.Y.), Vol. 2020 (2020), p. 1-8
Main Authors: , ,
Format: Article
Language: English
Summary: Aiming at the problem of low pedestrian target detection accuracy, we propose a detection algorithm based on an optimized Mask R-CNN that draws on recent deep-learning research to improve both the accuracy and the speed of detection. Because illumination, posture, background, and other factors affect human targets in natural-scene images, the target information is highly complex. SKNet replaces part of the convolution modules in the deep residual network so that the model can adaptively select the best convolution kernel during training and thus extract features more effectively. In addition, guided by statistical regularities of pedestrian shapes, the aspect ratio of the anchor boxes is modified to better match the natural proportions of pedestrian targets. Finally, a pedestrian target dataset is built by selecting suitable pedestrian images from the COCO dataset and expanded by adding noise and applying median filtering. The optimized algorithm is compared on this dataset with the original algorithm and several other mainstream target detection algorithms; the experimental results show that both the detection accuracy and the detection speed of the optimized algorithm improve, and its detection accuracy exceeds that of the other mainstream algorithms.
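The selective-kernel mechanism the summary attributes to SKNet can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: branch outputs stand in for convolutions of different kernel size, and a channel-wise softmax over the two branches weights their fusion. All function and weight names here are illustrative assumptions.

```python
import numpy as np

def selective_kernel_fuse(branch_a, branch_b, w_reduce, w_a, w_b):
    """Fuse two conv-branch outputs of shape (C, H, W) with SK-style attention.

    branch_a, branch_b: feature maps from kernels of different size.
    w_reduce: (C, d) squeeze projection to a compact descriptor.
    w_a, w_b: (d, C) per-branch attention projections.
    """
    u = branch_a + branch_b                       # element-wise fusion of branches
    s = u.mean(axis=(1, 2))                       # global average pooling -> (C,)
    z = np.maximum(s @ w_reduce, 0.0)             # squeeze with ReLU -> (d,)
    logits = np.stack([z @ w_a, z @ w_b])         # (2, C) per-branch channel logits
    att = np.exp(logits - logits.max(axis=0))
    att /= att.sum(axis=0)                        # softmax across the two branches
    return att[0][:, None, None] * branch_a + att[1][:, None, None] * branch_b

# Toy usage: random features stand in for 3x3- and 5x5-branch outputs.
rng = np.random.default_rng(0)
C, H, W, d = 8, 4, 4, 4
a = rng.standard_normal((C, H, W))
b = rng.standard_normal((C, H, W))
out = selective_kernel_fuse(a, b,
                            rng.standard_normal((C, d)),
                            rng.standard_normal((d, C)),
                            rng.standard_normal((d, C)))
print(out.shape)  # (8, 4, 4)
```

Because the two attention weights are non-negative and sum to one per channel, the fused output is a convex combination of the branches, which is what lets the network "select" a kernel softly rather than discretely.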
ISSN: 1076-2787, 1099-0526
DOI: 10.1155/2020/6662603
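The anchor-box modification described in the summary can also be sketched. With the anchor area held fixed, an aspect ratio r = height/width gives width sqrt(area/r) and height sqrt(area·r), so taller ratios produce the narrow, upright boxes that fit standing pedestrians. The ratio values below are illustrative assumptions, not the paper's exact settings.

```python
import math

def make_anchors(base_size, aspect_ratios):
    """Return (w, h) anchors, all of area base_size**2, one per
    aspect ratio r = height / width."""
    anchors = []
    area = float(base_size * base_size)
    for r in aspect_ratios:
        w = math.sqrt(area / r)   # constant area: w * h = area, h = w * r
        h = w * r
        anchors.append((round(w, 1), round(h, 1)))
    return anchors

# Common RPN-style ratios vs. taller, pedestrian-oriented ratios (illustrative).
default_anchors = make_anchors(64, [0.5, 1.0, 2.0])
pedestrian_anchors = make_anchors(64, [1.0, 2.0, 3.0])
print(default_anchors)     # [(90.5, 45.3), (64.0, 64.0), (45.3, 90.5)]
print(pedestrian_anchors)
```

Each anchor keeps the same area while its height-to-width ratio changes, so shifting the ratio set toward taller boxes changes anchor shape, not anchor scale.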