Improving Knowledge Distillation via Head and Tail Categories

Knowledge distillation (KD) is a technique that transfers "dark knowledge" from a deep teacher network (teacher) to a shallow student network (student). Despite significant advances in KD, existing work has not adequately mined two crucial types of knowledge: 1) the knowledge of head categories; and 2) the knowledge of tail categories. …
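For context, the baseline the abstract builds on is the classic Hinton-style KD objective: a temperature-softened KL-divergence term between teacher and student outputs, combined with ordinary cross-entropy. The sketch below illustrates that standard objective only, not this paper's head/tail-category method; the function name, temperature `T`, and weight `alpha` are illustrative defaults, not values from the article.

```python
import torch
import torch.nn.functional as F

def kd_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Classic KD objective: softened KL term plus supervised cross-entropy."""
    # Soften both output distributions; a higher T exposes the teacher's
    # "dark knowledge", i.e. the relative probabilities of non-target classes.
    log_p_student = F.log_softmax(student_logits / T, dim=1)
    p_teacher = F.softmax(teacher_logits / T, dim=1)
    # T*T rescales gradients so the distillation term's magnitude is
    # comparable across temperatures.
    distill = F.kl_div(log_p_student, p_teacher, reduction="batchmean") * (T * T)
    # Ordinary supervised loss on the ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)
    return alpha * distill + (1.0 - alpha) * ce

# Example with random logits for a 10-class problem.
student_logits = torch.randn(8, 10)
teacher_logits = torch.randn(8, 10)
labels = torch.randint(0, 10, (8,))
loss = kd_loss(student_logits, teacher_logits, labels)
```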

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2024-05, Vol. 34 (5), pp. 3465-3480
Main Authors: Xu, Liuchi; Ren, Jin; Huang, Zhenhua; Zheng, Weishi; Chen, Yunwen
Format: Article
Language:English