Knowledge distillation-based lightweight network for power scenarios inspection
Main Authors:
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Summary: To address the problem that target detection networks in power scenarios require a large number of parameters and complex structures to ensure accuracy, a knowledge distillation method is proposed to transfer intermediate feature knowledge and output logit knowledge from the teacher model to the student model. The distilled student model achieves high accuracy while retaining the advantages of fewer parameters and faster inference. GFL-Res18 is used as the student model and GFL-Res101 as the teacher model. The distilled student model achieves an accuracy of 78.8% with 19.1M parameters, an inference speed of 24.9 img/s, and a computational cost of 155.2 GFLOPs. Experimental results show that our distillation method achieves the highest accuracy for the student model compared with other distillation methods.
ISSN: 2688-0938
DOI: 10.1109/CAC59555.2023.10451266
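
The abstract names two kinds of transferred knowledge: intermediate features and output logits. A minimal PyTorch sketch of such a combined distillation loss follows; the temperature, the loss weights, the channel widths, and the 1x1 convolution used to align student and teacher feature maps are illustrative assumptions, not details taken from the paper.

```python
# Sketch of a two-part distillation loss: KL divergence on softened logits
# (response/logit knowledge) plus MSE on aligned intermediate feature maps
# (feature knowledge). All hyperparameters below are assumed for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DistillationLoss(nn.Module):
    def __init__(self, temperature=4.0, alpha=0.5, beta=0.5,
                 student_channels=256, teacher_channels=512):
        super().__init__()
        self.T = temperature
        self.alpha = alpha  # weight on logit (response) distillation
        self.beta = beta    # weight on feature distillation
        # 1x1 conv projects student features to the teacher's channel width.
        self.align = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_logits, teacher_logits, student_feat, teacher_feat):
        # Logit knowledge: KL divergence between temperature-softened
        # distributions, scaled by T^2 as in standard distillation.
        kd = F.kl_div(
            F.log_softmax(student_logits / self.T, dim=1),
            F.softmax(teacher_logits / self.T, dim=1),
            reduction="batchmean",
        ) * (self.T ** 2)

        # Feature knowledge: MSE between channel-aligned (and, if needed,
        # spatially resized) intermediate feature maps.
        aligned = self.align(student_feat)
        if aligned.shape[-2:] != teacher_feat.shape[-2:]:
            aligned = F.interpolate(aligned, size=teacher_feat.shape[-2:],
                                    mode="bilinear", align_corners=False)
        feat = F.mse_loss(aligned, teacher_feat)

        return self.alpha * kd + self.beta * feat


# Example with random tensors standing in for real model outputs:
loss_fn = DistillationLoss()
s_logits, t_logits = torch.randn(8, 20), torch.randn(8, 20)
s_feat, t_feat = torch.randn(8, 256, 32, 32), torch.randn(8, 512, 16, 16)
loss = loss_fn(s_logits, t_logits, s_feat, t_feat)
```

In practice this distillation term would be added to the detector's own training loss, so the student is supervised jointly by the ground truth and by the teacher's features and logits.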