
Compressive statistical learning with random feature moments


Bibliographic Details
Published in: Mathematical Statistics and Learning (Online), 2021-08, Vol. 3 (2), p. 113-164
Main Authors: Gribonval, Rémi, Blanchard, Gilles, Keriven, Nicolas, Traonmilin, Yann
Format: Article
Language: English
Description
Summary: We describe a general framework — compressive statistical learning — for resource-efficient large-scale learning: the training collection is compressed in one pass into a low-dimensional sketch (a vector of random empirical generalized moments) that captures the information relevant to the considered learning task. A near-minimizer of the risk is computed from the sketch through the solution of a nonlinear least squares problem. We investigate sufficient sketch sizes to control the generalization error of this procedure. The framework is illustrated on compressive PCA, compressive clustering, and compressive Gaussian mixture modeling with fixed known variance. The latter two are further developed in a companion paper.
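
To make the sketching step of the abstract concrete, here is a minimal Python illustration of a one-pass empirical sketch built from random Fourier features, one common choice of random generalized moments in compressive learning. The names (compute_sketch, Omega) and the Gaussian draw of frequencies are illustrative assumptions, not the paper's exact construction.

import numpy as np

def compute_sketch(X, Omega):
    """One-pass empirical sketch: average of complex exponential moments.

    X     : (n, d) data matrix
    Omega : (d, m) random frequency matrix defining the generalized moments
    Returns the length-m complex vector z = (1/n) * sum_i exp(i * Omega^T x_i).
    """
    return np.exp(1j * X @ Omega).mean(axis=0)

# Hypothetical usage: sketch a synthetic two-cluster dataset.
rng = np.random.default_rng(0)
n, d, m = 10_000, 2, 50                      # sample size, dimension, sketch size
X = np.concatenate([rng.normal(-2.0, 1.0, (n // 2, d)),
                    rng.normal(+2.0, 1.0, (n // 2, d))])
Omega = rng.normal(0.0, 1.0, (d, m))         # frequencies drawn i.i.d. Gaussian
z = compute_sketch(X, Omega)                 # m complex numbers summarize n points
print(z.shape)                               # (50,)

In the full framework described above, a near-minimizer of the risk would then be recovered from z alone, e.g. by a nonlinear least-squares fit matching the sketch of a parametric model to z, so the original n samples never need to be revisited.
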
ISSN: 2520-2316, 2520-2324
DOI: 10.4171/msl/20