ProX: A Reversed Once-for-All Network Training Paradigm for Efficient Edge Models Training in Medical Imaging

Bibliographic Details
Main Authors: Lim, Shin Wei; Chan, Chee Seng; Mohd Faizal, Erma Rahayu; Ewe, Kok Howg
Format: Conference Proceeding
Language: English
Subjects:
Online Access: Request full text
Description
Summary: The use of edge models in the medical field has a significant impact on promoting the accessibility of real-time medical services in under-developed regions. However, handling the latency-accuracy trade-off to produce such an edge model is very challenging. Although the recent Once-For-All (OFA) network can directly produce a set of sub-network designs with the Progressive Shrinking (PS) algorithm, it still suffers from training resource and time inefficiency. In this paper, we propose a new OFA training algorithm, namely Progressive Expansion (ProX). Empirically, we show that the proposed paradigm can reduce training time by up to 68%, while still producing sub-networks with similar or better accuracy than those trained with OFA-PS on the ROCT (classification), BRATS, and Hippocampus (3D segmentation) public medical datasets.
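The contrast the abstract draws between OFA Progressive Shrinking and the reversed ProX schedule can be sketched conceptually. The snippet below is a minimal illustration, not the paper's actual algorithm: the elastic dimensions (`KERNELS`, `DEPTHS`, `WIDTHS`) and the phase schedules are hypothetical values chosen only to show the difference in training order (largest-to-smallest unlocking for PS versus smallest-to-largest for a ProX-style expansion).

```python
import random

# Hypothetical elastic dimensions for an OFA-style supernet
# (illustrative values, not taken from the ProX paper).
KERNELS = [3, 5, 7]
DEPTHS = [2, 3, 4]
WIDTHS = [0.5, 0.75, 1.0]

def shrinking_phases(kernels, depths, widths):
    """OFA Progressive Shrinking order: start from the largest
    sub-network choices and progressively unlock smaller ones."""
    phases = []
    for i in range(len(kernels)):
        phases.append({
            "kernel": kernels[-(i + 1):],  # largest first, then add smaller
            "depth": depths[-(i + 1):],
            "width": widths[-(i + 1):],
        })
    return phases

def expansion_phases(kernels, depths, widths):
    """Reversed (ProX-style) order: start from the smallest
    sub-network choices and progressively unlock larger ones."""
    phases = []
    for i in range(len(kernels)):
        phases.append({
            "kernel": kernels[:i + 1],     # smallest first, then add larger
            "depth": depths[:i + 1],
            "width": widths[:i + 1],
        })
    return phases

def sample_subnet(phase, rng=random):
    """Uniformly sample one sub-network config from the choices
    active in the current training phase."""
    return {dim: rng.choice(opts) for dim, opts in phase.items()}
```

In both schedules the final phase covers the full design space; the difference is which sub-networks dominate early training, which is where the abstract's reported training-time savings would come from.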
ISSN: 2381-8549
DOI: 10.1109/ICIP46576.2022.9897495