
Deep Learning Used to Detect Gear Inspection

Bibliographic Details
Main Authors: Jian, Jia-Xian, Wang, Chuin-Mu
Format: Conference Proceeding
Language: English
Online Access: Request full text
Description
Summary: Automatic gear defect detection equipment is relatively expensive, so small and medium-sized enterprises cannot afford it. Most companies therefore still rely on manual inspection for gear defect detection, which not only takes a long time but also yields uneven detection quality. This paper proposes using AI technology to build an inexpensive and fast gear defect detection method, and applies it to detect tooth profile defects, tooth pitch defects, and central hole defects. The proposed method consists of four steps. In the first step, the ResNet model [1] classifies whether the gear image is complete. In the second step, the YOLOv4 model [2] locates the rectangular regions of the tooth profile and tooth pitch in the image and crops them out. The third step uses the UNet model [3] to segment the tooth profile and tooth pitch contours and calculate the area each contour occupies. Finally, whether the deviation from the average area is too large serves as the basis for judging whether the gear is defective. In the experiments, 186 gear images are used for detection, and the obtained accuracy is about 91%. Besides verifying the feasibility of the proposed method, this result also shows that the method can quickly and accurately detect gear defects that are difficult to judge by the human eye.
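
To make the four-step pipeline concrete, the following is a minimal Python sketch of how it could be wired together. The wrapper names (classify_complete, detect_regions, segment_profile) and the 5% area tolerance are illustrative assumptions, not details from the paper; the three model wrappers are replaced here by trivial stubs so the sketch runs without the trained ResNet, YOLOv4, and UNet networks.

```python
import numpy as np

# Trivial stand-ins for the three trained models described in the abstract
# (ResNet: image completeness, YOLOv4: region detection, UNet: segmentation).
# These stubs exist only so the sketch runs; real code would load the networks.
def classify_complete(image):
    return True                              # stub: treat every image as complete

def detect_regions(image):
    h, w = image.shape[:2]
    return [(0, 0, w, h)]                    # stub: one box covering the whole image

def segment_profile(crop):
    return crop > 128                        # stub: grayscale threshold as a mask

def is_defective(masks, mean_area, rel_tol=0.05):
    """Step 4: flag the gear when any segmented contour's pixel area deviates
    from the reference mean area by more than rel_tol. (The tolerance value is
    an assumption; the paper's abstract does not state one.)"""
    for mask in masks:
        area = np.count_nonzero(mask)        # contour area = foreground pixel count
        if abs(area - mean_area) / mean_area > rel_tol:
            return True
    return False

def inspect_gear(image, mean_area):
    # Step 1: ResNet decides whether the gear image is complete.
    if not classify_complete(image):
        return "incomplete"
    # Step 2: YOLOv4 locates tooth-profile / tooth-pitch boxes, which are cropped.
    crops = [image[y0:y1, x0:x1] for (x0, y0, x1, y1) in detect_regions(image)]
    # Step 3: UNet segments each crop into a binary contour mask.
    masks = [segment_profile(c) for c in crops]
    # Step 4: the area-deviation test decides pass vs. defect.
    return "defective" if is_defective(masks, mean_area) else "pass"

# Example on a synthetic 8-bit grayscale image:
image = (np.random.rand(256, 256) * 255).astype(np.uint8)
print(inspect_gear(image, mean_area=32768))
```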
ISSN: 2693-8421
DOI: 10.1109/SNPD54884.2022.10051817