Lightweight object detector based on composite attention residual network and boundary location loss
Published in: Neurocomputing (Amsterdam), 2022-07, Vol. 494, pp. 132-147
Main Authors: , , , ,
Format: Article
Language: English
Summary: Deep-learning-based object detectors have received extensive attention, but their high computational cost has become an obstacle to large-scale application. Further reducing hardware requirements while maintaining high detection accuracy remains a great challenge for object detection. We propose a one-stage lightweight object detector and a new regression loss. In this method, ResNet is improved and combined with an attention mechanism to preserve feature information as completely as possible with fewer parameters, and the multi-scale feature fusion network is improved to reduce the inference complexity of the structure. In addition, the bounding box regression loss is improved: the specific position of the bounding box is adjusted by balancing multiple factors during regression. The experimental results show that: 1) combining most detectors with the improved loss can further improve their performance; 2) taken as a whole, our improved network and loss balance both speed and accuracy on Pascal VOC and COCO; 3) with additional training tricks such as DropBlock and Mosaic, we achieve better overall performance on the COCO test-dev set: 38.42 AP (average precision) at 40.3 FPS.
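The record does not spell out the paper's composite attention design, but the abstract's combination of a residual backbone with channel attention can be illustrated by a standard squeeze-and-excitation block. The function name, weight shapes, and reduction scheme below are illustrative assumptions, not the authors' architecture:

```python
import numpy as np

def squeeze_excite(feat, w1, w2):
    """SE-style channel attention (illustrative sketch only; NOT the
    paper's composite attention module).

    feat: feature map of shape (C, H, W)
    w1:   first FC weight, shape (C // r, C), r = reduction ratio
    w2:   second FC weight, shape (C, C // r)
    """
    s = feat.mean(axis=(1, 2))              # squeeze: global average pool -> (C,)
    z = np.maximum(w1 @ s, 0.0)             # excitation: FC + ReLU
    a = 1.0 / (1.0 + np.exp(-(w2 @ z)))     # FC + sigmoid -> per-channel weights in (0, 1)
    return feat * a[:, None, None]          # rescale each channel of the input
```

In a residual block, the rescaled output would typically be added back to the block's input, letting the network emphasize informative channels at little parameter cost.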
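The exact form of the proposed boundary location loss is not given in this record. As context for the kind of IoU-based regression loss it builds on, a DIoU-style loss (overlap term plus a center-distance penalty) can be sketched as follows; the function name and penalty term are assumptions, not the paper's definition:

```python
def diou_loss(box_p, box_g):
    """DIoU-style bounding-box regression loss (illustrative sketch only;
    NOT the paper's boundary location loss).

    Boxes are (x1, y1, x2, y2). Loss = 1 - IoU + d^2 / c^2, where d is the
    distance between box centers and c is the diagonal length of the
    smallest box enclosing both.
    """
    # Intersection area of the two boxes
    ix1, iy1 = max(box_p[0], box_g[0]), max(box_p[1], box_g[1])
    ix2, iy2 = min(box_p[2], box_g[2]), min(box_p[3], box_g[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Union area and IoU
    area_p = (box_p[2] - box_p[0]) * (box_p[3] - box_p[1])
    area_g = (box_g[2] - box_g[0]) * (box_g[3] - box_g[1])
    union = area_p + area_g - inter
    iou = inter / union if union > 0 else 0.0

    # Squared distance between box centers
    cxp, cyp = (box_p[0] + box_p[2]) / 2, (box_p[1] + box_p[3]) / 2
    cxg, cyg = (box_g[0] + box_g[2]) / 2, (box_g[1] + box_g[3]) / 2
    d2 = (cxp - cxg) ** 2 + (cyp - cyg) ** 2

    # Squared diagonal of the smallest enclosing box
    ex1, ey1 = min(box_p[0], box_g[0]), min(box_p[1], box_g[1])
    ex2, ey2 = max(box_p[2], box_g[2]), max(box_p[3], box_g[3])
    c2 = (ex2 - ex1) ** 2 + (ey2 - ey1) ** 2

    return 1.0 - iou + (d2 / c2 if c2 > 0 else 0.0)
```

Unlike a plain IoU loss, the distance term still provides a useful gradient when the predicted and ground-truth boxes do not overlap, which is one of the "multiple factors" that such losses balance.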
ISSN: 0925-2312, 1872-8286
DOI: 10.1016/j.neucom.2022.04.090