
Analysis of the IJCNN 2011 UTL challenge

Bibliographic Details
Published in: Neural Networks, 2012-08, Vol. 32, pp. 174-178
Main Authors: Guyon, Isabelle; Dror, Gideon; Lemaire, Vincent; Silver, Daniel L.; Taylor, Graham; Aha, David W.
Format: Article
Language:English
Description
Summary: We organized a challenge in “Unsupervised and Transfer Learning”: the UTL challenge (http://clopinet.com/ul). We made available large datasets from various application domains: handwriting recognition, image recognition, video processing, text processing, and ecology. The goal was to learn data representations that capture regularities of an input space for re-use across tasks. The representations were evaluated on supervised learning “target tasks” unknown to the participants. The first phase of the challenge was dedicated to “unsupervised transfer learning” (the competitors were given only unlabeled data). The second phase was dedicated to “cross-task transfer learning” (the competitors were provided with a limited amount of labeled data from “source tasks”, distinct from the “target tasks”). The analysis indicates that learned data representations yield significantly better results than those obtained with original data or data preprocessed with standard normalizations and functional transforms.
ISSN: 0893-6080; 1879-2782
DOI: 10.1016/j.neunet.2012.02.010
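
As a minimal sketch of the evaluation idea described in the summary, assuming synthetic data and scikit-learn: a representation is learned from unlabeled data only, then judged by how well a simple supervised classifier performs on a small labeled target task. PCA stands in here for whatever representation learner a participant might use; the challenge's actual datasets, classifier, and scoring protocol are not reproduced.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Unlabeled "development" data: high-dimensional inputs whose structure
# lives in a low-dimensional latent subspace plus noise (illustrative only).
n_unlabeled, n_features, n_informative = 5000, 100, 5
basis = rng.normal(size=(n_informative, n_features))
latent_unlab = rng.normal(size=(n_unlabeled, n_informative))
X_unlabeled = latent_unlab @ basis + rng.normal(scale=3.0, size=(n_unlabeled, n_features))

# Small labeled "target task" (train) and a held-out test set.
def make_labeled(n):
    latent = rng.normal(size=(n, n_informative))
    X = latent @ basis + rng.normal(scale=3.0, size=(n, n_features))
    y = (latent[:, 0] > 0).astype(int)   # label depends on the latent space
    return X, y

X_train, y_train = make_labeled(50)      # few labeled examples, as in transfer settings
X_test, y_test = make_labeled(2000)

# Representation learned from unlabeled data only (PCA as a stand-in).
rep = PCA(n_components=n_informative).fit(X_unlabeled)

def auc(train, test):
    clf = LogisticRegression(max_iter=1000).fit(train, y_train)
    return roc_auc_score(y_test, clf.decision_function(test))

print("AUC on raw inputs:              %.3f" % auc(X_train, X_test))
print("AUC on learned representation:  %.3f" % auc(rep.transform(X_train), rep.transform(X_test)))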