DQDG: Data-Free Quantization With Dual Generators for Keyword Spotting
Published in: IEEE Signal Processing Letters, 2024, Vol. 31, pp. 1540-1544
Main Authors:
Format: Article
Language: English
Subjects:
Citations: Items that this one cites
Online Access: Get full text
Summary: Data-free quantization effectively compresses deep learning models while preserving data privacy. However, previous data-free quantization methods applied to keyword spotting models have two issues: (1) the synthesized samples are excessively similar, leading to severe homogenization; (2) low-quality samples during the initial training hinder model fine-tuning. To address these issues, this paper proposes a novel framework called Data-Free Quantization with Dual Generators (DQDG). The framework introduces Dual Generators with a Center Distance Constraint (DGCDC) to enhance the intra-class heterogeneity of synthesized samples, and uses a selector to pick high-quality samples that assist model fine-tuning. Additionally, by requiring the quantized model to infer complete data from masked data, Time Masking Quantization Distillation (TMQD) improves the model's understanding of the data distribution. Experimental results demonstrate that DQDG outperforms existing data-free quantization methods by a large margin.
ISSN: 1070-9908, 1558-2361
DOI: 10.1109/LSP.2024.3407481
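
The two components named in the summary lend themselves to a short illustration. Below is a minimal sketch, assuming a PyTorch setup: a hinge-style center distance loss that keeps the two generators' per-class feature centers apart (one reading of DGCDC), and a time-masking distillation loss in which the quantized student matches the full-precision teacher on time-masked inputs (one reading of TMQD). All function names, the zero-fill masking, the hinge margin, and the tensor shapes are illustrative assumptions, not details confirmed by this record or the paper.

```python
# Hedged sketch of the two losses named in the summary. Everything here
# (names, shapes, masking policy, margin) is an assumption for illustration,
# not the paper's actual implementation.
import torch
import torch.nn.functional as F

def center_distance_loss(feat_a, feat_b, labels, num_classes, margin=1.0):
    """Push the per-class feature centers of generator A and generator B
    at least `margin` apart, encouraging intra-class heterogeneity."""
    loss = feat_a.new_zeros(())
    count = 0
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            center_a = feat_a[mask].mean(dim=0)
            center_b = feat_b[mask].mean(dim=0)
            # Hinge: no penalty once the two centers are margin apart.
            loss = loss + F.relu(margin - (center_a - center_b).norm())
            count += 1
    return loss / max(count, 1)

def time_masking_distillation(teacher, student, x, mask_ratio=0.2):
    """Zero out a contiguous span of time frames, then distill the teacher's
    full-data logits into the student's masked-data logits."""
    b, t, f = x.shape                      # (batch, time, features)
    span = max(1, int(t * mask_ratio))
    start = torch.randint(0, t - span + 1, (1,)).item()
    x_masked = x.clone()
    x_masked[:, start:start + span, :] = 0.0
    with torch.no_grad():
        t_logits = teacher(x)              # teacher sees the complete data
    s_logits = student(x_masked)           # student must infer the masked span
    return F.kl_div(F.log_softmax(s_logits, dim=-1),
                    F.softmax(t_logits, dim=-1),
                    reduction="batchmean")
```

In this reading, the center loss would be added to the generators' training objective while the masking loss augments the quantized model's distillation objective; the actual loss weighting and masking policy are details only the paper itself specifies.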