On the performance of deep learning for numerical optimization: An application to protein structure prediction

Bibliographic Details
Published in: Applied Soft Computing, 2021-10, Vol. 110, p. 107596, Article 107596
Main Authors: Rakhshani, Hojjat, Idoumghar, Lhassane, Ghambari, Soheila, Lepagnot, Julien, Brévilliers, Mathieu
Format: Article
Language: English
Description
Summary: Deep neural networks have recently drawn considerable attention for building and evaluating artificial learning models for perceptual tasks. Optimization, on the other hand, is the problem of selecting a set of elements that is optimal or near-optimal with respect to some criterion. Here, we present a study on the performance of deep learning models for global optimization problems; more precisely, we investigate how machine learning techniques can learn to optimize such problems. Rather than proposing very large networks with a heavy GPU burden and long training times, we focus on lightweight implementations when searching for the best architecture. The performance of neural architecture search (NAS) is first analyzed through empirical experiments on the CEC 2017 benchmark suite; it is then applied to a set of protein structure prediction (PSP) problems. The experiments reveal that the generated learning models can achieve results competitive with hand-designed algorithms, given a sufficient computational budget.
Highlights:
• This study revisits the common application of deep learning and reformulates it for numerical optimization and protein structure prediction problems.
• The proposed approach adopts the idea of neural architecture search to solve the problem at hand.
• Our contribution achieves competitive performance compared to hand-crafted algorithms.
• Transfer and ensemble learning are used to show how the optimization process can be accelerated.
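Since the record only summarizes the approach, the following is a minimal, self-contained sketch (not the authors' implementation) of what "learning to optimize" with a lightweight architecture search could look like. The sphere objective, the tiny two-layer policy network, and the hidden-size grid are all assumptions introduced purely for illustration; in the paper, candidate architectures are instead evaluated on the CEC 2017 and PSP problems via a proper NAS procedure.

```python
# Illustrative sketch only: a toy "learn to optimize" loop in the spirit of the
# abstract, NOT the authors' NAS method. The sphere benchmark and the tiny
# architecture grid below are assumptions made for this example.
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Toy objective standing in for a CEC 2017 / PSP energy function."""
    return float(np.sum(x ** 2))

def make_policy(dim, hidden, rng):
    """Random two-layer network mapping a candidate solution to a search step."""
    w1 = rng.normal(scale=0.5, size=(dim, hidden))
    w2 = rng.normal(scale=0.5, size=(hidden, dim))
    return (w1, w2)

def policy_step(params, x):
    w1, w2 = params
    h = np.tanh(x @ w1)
    return np.tanh(h @ w2)  # proposed step direction in [-1, 1]^dim

def run_optimizer(params, dim, budget, rng, step_size=0.1):
    """Run the learned update rule for `budget` evaluations, return best value."""
    x = rng.uniform(-5.0, 5.0, size=dim)
    best = sphere(x)
    for _ in range(budget):
        x = x + step_size * policy_step(params, x)
        best = min(best, sphere(x))
    return best

# Lightweight "architecture search": sample a few hidden sizes and random
# weights, keep the configuration whose induced optimizer reaches the lowest
# objective value within the evaluation budget.
dim, budget = 10, 200
candidates = [(h, make_policy(dim, h, rng)) for h in (4, 8, 16) for _ in range(5)]
scores = [(run_optimizer(p, dim, budget, rng), h) for h, p in candidates]
best_score, best_hidden = min(scores)
print(f"best hidden size: {best_hidden}, best objective reached: {best_score:.4f}")
```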
ISSN: 1568-4946, 1872-9681
DOI: 10.1016/j.asoc.2021.107596