Neural Architecture Search for Time Series Classification
Format: Conference Proceeding
Language: English
Summary: Neural architecture search (NAS) has achieved great success in computer vision tasks such as object detection and image recognition. However, deep learning models have millions or billions of parameters, and applying NAS methods with only a small amount of data is not trivial. Unlike computer vision tasks, labeling time series data for supervised learning is a laborious and expensive task that often requires expertise. This paper therefore proposes a simple yet effective fine-tuning method based on repeated k-fold cross-validation to train deep residual networks using only a small amount of time series data. The main idea is that each model fitted during cross-validation transfers its weights to the subsequent folds over the rounds. We conducted extensive experiments on 85 instances from the UCR archive for Time Series Classification (TSC) to investigate the performance of the proposed approach. The experimental results show that our proposed model, called NAS-T, reaches a new state-of-the-art TSC accuracy by designing a single classifier able to beat HIVE-COTE, an ensemble of 37 individual classifiers.
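The fine-tuning scheme described in the summary — carrying each fold's fitted weights forward as the initialisation for the next fold, across repeated k-fold rounds — can be sketched as follows. This is a minimal illustration under stated assumptions, not the paper's implementation: `train_fn`, `repeated_kfold_finetune`, and the toy linear model are hypothetical stand-ins for the deep residual network and its training loop.

```python
import numpy as np

def repeated_kfold_finetune(X, y, train_fn, n_splits=5, n_repeats=3, seed=0):
    """Repeated k-fold fine-tuning: weights fitted on one fold warm-start
    training on the next fold, over all rounds.

    train_fn(X_tr, y_tr, init_weights) -> weights  (hypothetical interface;
    init_weights is None for the very first fold, i.e. train from scratch).
    """
    rng = np.random.default_rng(seed)
    weights = None  # first fold trains from scratch
    for _ in range(n_repeats):
        # fresh random partition of the data for each repeat
        folds = np.array_split(rng.permutation(len(X)), n_splits)
        for k in range(n_splits):
            tr_idx = np.concatenate(
                [folds[j] for j in range(n_splits) if j != k]
            )
            # transfer the previous fold's weights as initialisation
            weights = train_fn(X[tr_idx], y[tr_idx], weights)
    return weights

def toy_train(X_tr, y_tr, w):
    """Stand-in for network training: a few gradient-descent steps
    on a least-squares objective, starting from the given weights."""
    w = np.zeros(X_tr.shape[1]) if w is None else w.copy()
    for _ in range(50):
        grad = X_tr.T @ (X_tr @ w - y_tr) / len(y_tr)
        w -= 0.1 * grad
    return w

# Toy usage: recover a known linear map from noiseless data.
X = np.random.default_rng(1).normal(size=(60, 4))
w_true = np.array([1.0, -2.0, 0.5, 0.0])
y = X @ w_true
w = repeated_kfold_finetune(X, y, toy_train)
```

Because every fold's optimum here is the same `w_true`, the warm-started folds simply continue converging toward it; with a real network, the warm start is what lets each fold train on little data without starting over.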
ISSN: | 2161-4407 |
DOI: | 10.1109/IJCNN48605.2020.9206721 |