Research on obstacle detection and path planning based on visual navigation for mobile robot
Published in: Journal of Physics: Conference Series, 2020-08, Vol. 1601 (6), p. 062044
Main Authors:
Format: Article
Language: English
Summary: Both obstacle detection and path planning are essential for the autonomous visual navigation of a mobile robot that must move freely while avoiding obstacles. In this paper, an obstacle detection scheme based on binocular vision is constructed and combined with an improved A* algorithm to realize obstacle-avoidance motion on a mobile robot platform. First, a binocular vision system is used to perceive the robot's environment and acquire visual information. A Gaussian filter reduces image noise, and grayscale images are obtained from the binocular vision system. The grayscale images are then matched with SAD (Sum of Absolute Differences) to obtain the disparity image. Second, the location of each obstacle is extracted from the environmental depth information contained in the disparity image, and an improved A* algorithm, which can correct errors in the planned path, is used as the path planning algorithm. Furthermore, a detection experiment for the robot's visual obstacle avoidance system was carried out and the optimal path was obtained. The examples show that the obstacle avoidance system can complete the obstacle detection task and provide a better path for the mobile robot.
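
The abstract only names the SAD matching step; the paper's own implementation is not reproduced here. As a rough, minimal sketch of how SAD block matching over rectified grayscale image pairs can produce a disparity map, the following NumPy-only example may help (the function name and parameters `block` and `max_disp` are illustrative assumptions, not taken from the paper):

```python
import numpy as np

def sad_disparity(left, right, block=7, max_disp=64):
    """Block-matching disparity via Sum of Absolute Differences (SAD).

    left, right: rectified grayscale images as 2-D float arrays.
    Returns a disparity map of the same size (borders are left at 0).
    """
    h, w = left.shape
    half = block // 2
    disp = np.zeros((h, w), dtype=np.float32)

    for y in range(half, h - half):
        for x in range(half, w - half):
            patch_l = left[y - half:y + half + 1, x - half:x + half + 1]
            best_d, best_cost = 0, np.inf
            # Search candidate disparities along the same scan line.
            for d in range(0, min(max_disp, x - half) + 1):
                patch_r = right[y - half:y + half + 1,
                                x - d - half:x - d + half + 1]
                cost = np.abs(patch_l - patch_r).sum()  # SAD cost
                if cost < best_cost:
                    best_cost, best_d = cost, d
            disp[y, x] = best_d
    return disp
```

In practice the images would first be denoised (e.g. with a Gaussian filter, as the abstract describes) and an optimized stereo block matcher would be used instead of these Python loops; the sketch is only meant to show what the SAD matching cost is.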
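Similarly, the abstract mentions an "improved A*" algorithm but does not describe the improvement. As a point of reference only, a standard A* search on a 4-connected occupancy grid (the grid/start/goal representation below is an assumption, not the paper's) can be sketched as:

```python
import heapq

def astar(grid, start, goal):
    """Standard A* on a 4-connected occupancy grid.

    grid: 2-D list where 0 = free cell and 1 = obstacle.
    start, goal: (row, col) tuples. Returns a list of cells from start to
    goal, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
    open_heap = [(h(start), 0, start)]          # entries are (f, g, cell)
    came_from, g_score = {}, {start: 0}

    while open_heap:
        _, g, cur = heapq.heappop(open_heap)
        if cur == goal:
            path = [cur]
            while cur in came_from:             # walk parents back to start
                cur = came_from[cur]
                path.append(cur)
            return path[::-1]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if not (0 <= nxt[0] < rows and 0 <= nxt[1] < cols):
                continue
            if grid[nxt[0]][nxt[1]] == 1:       # skip occupied cells
                continue
            ng = g + 1
            if ng < g_score.get(nxt, float("inf")):
                g_score[nxt] = ng
                came_from[nxt] = cur
                heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
    return None

# Example: plan around a single obstacle cell on a small grid.
grid = [[0, 0, 0],
        [0, 1, 0],
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 2)))
```

The obstacle cells of such a grid would be populated from the obstacle locations recovered from the disparity/depth information described in the abstract; the paper's improved variant then refines the planned path.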
ISSN: 1742-6588, 1742-6596
DOI: 10.1088/1742-6596/1601/6/062044