
Automatic Searching and Pruning of Deep Neural Networks for Medical Imaging Diagnostic

Bibliographic Details
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2021-12, Vol. 32 (12), p. 5664-5674
Main Authors: Fernandes, Francisco Erivaldo; Yen, Gary G.
Format: Article
Language:English
Summary: The field of medical imaging diagnostics makes use of a variety of imaging modalities, e.g., X-rays, ultrasounds, computed tomography, and magnetic resonance imaging, to assist physicians with the diagnosis of patients' illnesses. Due to their state-of-the-art results on many challenging image classification tasks, deep neural networks (DNNs) are suitable tools for providing physicians with diagnostic support when dealing with medical images. To further advance the field, the present work proposes a two-phase algorithm, called here DNNDeepeningPruning, capable of automatically generating compact DNN architectures for a given database. In the first phase, also called the deepening phase, the algorithm grows a DNN by adding blocks of residual layers one after another until the model overfits the given data. In the second phase, called the pruning phase, the algorithm prunes the model produced in the first phase to yield a DNN with a small number of floating-point operations, guided by a preference specified by the user. The proposed algorithm unifies the two separate fields of DNN architecture search and pruning under a single framework, and it is tested on two medical imaging data sets with satisfactory results.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2020.3027308
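
The summary describes a two-phase procedure: grow a residual network block by block until it starts to overfit, then prune the result down to a user-preferred budget of floating-point operations (FLOPs). The sketch below illustrates that control flow in Python; the helper names (build_model, train_and_eval, flops, prune_least_important), the validation-loss stopping rule, and the one-filter-at-a-time pruning loop are assumptions made for exposition, not the authors' published implementation.

# Illustrative sketch only: the stopping rule used as an overfitting signal
# and the filter-pruning criterion are assumptions, not the exact method
# described in the paper.
import copy

def deepening_phase(build_model, train_and_eval, max_blocks=20):
    """Phase 1: grow the network one residual block at a time.

    build_model(n) is assumed to return a model with n residual blocks;
    train_and_eval(model) trains it and returns the validation loss.
    Growth stops when adding a block no longer lowers validation loss,
    taken here as the onset of overfitting.
    """
    best_model, best_val = None, float("inf")
    for n_blocks in range(1, max_blocks + 1):
        model = build_model(n_blocks)
        val_loss = train_and_eval(model)
        if val_loss >= best_val:
            break  # a deeper model stopped helping on held-out data
        best_model, best_val = copy.deepcopy(model), val_loss
    return best_model

def pruning_phase(model, flops, prune_least_important, budget):
    """Phase 2: shrink the grown model to the user's FLOP budget.

    flops(model) is assumed to estimate floating-point operations;
    prune_least_important(model) removes one filter (e.g., the one with
    the smallest L1 weight norm) and returns the smaller model.
    """
    while flops(model) > budget:
        model = prune_least_important(model)
    return model

def dnn_deepening_pruning(build_model, train_and_eval, flops,
                          prune_least_important, budget, max_blocks=20):
    """End-to-end driver: deepen first, then prune to the budget."""
    grown = deepening_phase(build_model, train_and_eval, max_blocks)
    return pruning_phase(grown, flops, prune_least_important, budget)

In practice, build_model would assemble ResNet-style residual blocks and prune_least_important would rank filters by some importance score, with the FLOP budget encoding the user's stated preference for compactness over accuracy.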