
CDCP: A Framework for Accelerating Dynamic Channel Pruning Methods

Bibliographic Details
Main Authors: Xiang, Zhaohong, Luo, Yigui, Xie, Yin, She, Haihong, Liang, Weiming, Zhao, Laigang
Format: Conference Proceeding
Language: English
Description
Summary: Dynamic channel pruning is a technique aimed at reducing the theoretical computational complexity and inference latency of convolutional neural networks. Dynamic channel pruning methods introduce complex additional modules that dynamically select channels for each image. Because of these additional modules, dynamic channel pruning methods do not achieve their optimal acceleration effect in the real world. To address this problem, we propose Consecutive Dynamic Channel Pruning (CDCP), a novel framework that unifies almost all dynamic pruning methods designed for continuous image processing. The core idea of CDCP stems from our observation that adjusting the network for every frame in semantically continuous scenes is unnecessary, since adjacent frames often share similar network structures in dynamic channel pruning. CDCP introduces a simple binary classifier that determines whether the network structure needs to be adjusted for a new frame. Our method can also be used for semantically non-continuous image processing tasks, with a slightly lower probability of model reuse. We validate the effectiveness of CDCP on three dynamic channel pruning methods and achieve better acceleration when applying them with CDCP to the semantically continuous Waymo and nuScenes datasets and the semantically discontinuous COCO dataset.
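
The sketch below illustrates the reuse idea described in the abstract: a lightweight binary classifier decides per frame whether the channel-selection structure from the previous frame can be reused, and the expensive dynamic gating module is only re-run when it cannot. This is a minimal illustration only; the module names (gate_module, reuse_classifier), the 0.5 threshold, and the wrapper interface are assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn

class CDCPWrapper(nn.Module):
    """Hypothetical wrapper sketching a CDCP-style reuse decision."""

    def __init__(self, gate_module: nn.Module, reuse_classifier: nn.Module):
        super().__init__()
        self.gate_module = gate_module          # dynamic channel-selection module (expensive)
        self.reuse_classifier = reuse_classifier  # binary classifier: reuse previous structure or not
        self.cached_mask = None                 # channel mask computed at the last adjustment
        self.prev_frame = None

    def channel_mask(self, frame: torch.Tensor) -> torch.Tensor:
        if self.cached_mask is None:
            # First frame: always run the dynamic gating module.
            self.cached_mask = self.gate_module(frame)
        else:
            # Decide whether the cached structure is still valid for this frame.
            reuse_logit = self.reuse_classifier(frame, self.prev_frame)
            if torch.sigmoid(reuse_logit).item() < 0.5:
                # Frames differ too much: re-run the gating module and refresh the cache.
                self.cached_mask = self.gate_module(frame)
        self.prev_frame = frame
        return self.cached_mask

In semantically continuous streams (e.g. consecutive driving frames) the classifier would return "reuse" most of the time, so the gating cost is amortized across many frames; on discontinuous data the reuse rate drops and the behavior approaches the underlying dynamic pruning method.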
ISSN: 2690-5965
DOI: 10.1109/ICPADS60453.2023.00031