
Efficient Self-learning Evolutionary Neural Architecture Search


Bibliographic Details
Published in: Applied Soft Computing, 2023-10, Vol. 146, p. 110671, Article 110671
Main Authors: Qiu, Zhengzhong, Bi, Wei, Xu, Dong, Guo, Hua, Ge, Hongwei, Liang, Yanchun, Lee, Heow Pueh, Wu, Chunguo
Format: Article
Language: English
Description
Summary: The evolutionary algorithm has recently become a major method for neural architecture search (NAS). However, the fixed probability distribution employed by the traditional evolutionary algorithm cannot control the size of individual architectures, which may lead to structural complexity and redundancy, and it cannot learn from empirical information gathered during the search to guide the subsequent search more effectively and efficiently. Moreover, evaluating the performance of all the searched architectures requires significant computing resources and time. To overcome these challenges, we present the Efficient Self-learning Evolutionary Neural Architecture Search (ESE-NAS) method. First, we propose an Adaptive Learning Strategy for Mutation Sampling, composed of a Model Size Control module and a Credit Assignment method for Mutation Candidates, which guides the search by learning from the model size information and evaluation results of the architectures and adjusting the probability distributions for evolution sampling accordingly. Additionally, we develop a neural architecture performance predictor to further improve the efficiency of NAS. Experiments on the CIFAR-10 and CIFAR-100 datasets show that ESE-NAS significantly brings forward the first hitting time of the optimal architectures and reaches performance competitive with classic manually designed and NAS models while maintaining structural simplicity and efficiency.
• The Adaptive Learning Strategy for mutation sampling makes evolutionary neural architecture search more flexible and more efficient.
• The Model Size Control module ensures the compactness and simplicity of neural architectures.
• The Credit Assignment for Mutation Candidates method provides effective guidance to the evolution by identifying promising mutation objects according to performance differences.
• The performance predictor largely accelerates the evaluation process with selected regressors and sampled training data.
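The summary describes adjusting the mutation-sampling distribution via credit assignment and constraining model size. The following minimal Python sketch illustrates how such a mechanism could work under stated assumptions: the names (AdaptiveMutationSampler, size_penalty, MUTATION_OPS), the multiplicative weight-update rule, and the size budget are all hypothetical illustrations, not the authors' implementation.

```python
# Minimal, hypothetical sketch of an adaptive mutation-sampling loop in the
# spirit of the abstract. Names, update rule, and budget are assumptions.
import random

MUTATION_OPS = ["add_layer", "remove_layer", "widen", "narrow", "swap_op"]

class AdaptiveMutationSampler:
    """Keeps a probability distribution over mutation candidates and updates
    it from observed fitness differences (credit assignment)."""

    def __init__(self, ops, lr=0.1):
        self.ops = list(ops)
        self.weights = {op: 1.0 for op in self.ops}  # uniform prior
        self.lr = lr

    def sample(self):
        # Draw one mutation operator proportionally to its current weight.
        return random.choices(self.ops,
                              weights=[self.weights[o] for o in self.ops])[0]

    def assign_credit(self, op, fitness_delta):
        # Reward operators whose mutations improved fitness, decay the rest;
        # the floor keeps every candidate reachable.
        self.weights[op] = max(1e-3,
                               self.weights[op] * (1.0 + self.lr * fitness_delta))

def size_penalty(num_params, budget=1_000_000):
    """Assumed form of Model Size Control: penalize parameter counts over budget."""
    return max(0.0, (num_params - budget) / budget)
```

A toy search loop showing how the pieces could fit together; the evaluator below is a random stand-in for the expensive training-based evaluation (or for the paper's learned performance predictor):

```python
def evaluate(arch):
    # Stand-in evaluator: random score minus the size penalty.
    return random.random() - size_penalty(arch["params"])

sampler = AdaptiveMutationSampler(MUTATION_OPS)
parent = {"params": 500_000}
best = evaluate(parent)
for _ in range(200):
    op = sampler.sample()
    # Placeholder mutation: real code would apply `op` to the architecture.
    child = dict(parent, params=int(parent["params"] * random.uniform(0.8, 1.2)))
    fitness = evaluate(child)
    sampler.assign_credit(op, fitness - best)  # credit by performance difference
    if fitness > best:
        parent, best = child, fitness
```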
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2023.110671