
Performance analysis of greedy algorithms for minimising a Maximum Mean Discrepancy


Bibliographic Details
Published in: Statistics and Computing, 2023-02, Vol. 33 (1), Article 14
Main Author: Pronzato, Luc
Format: Article
Language: English
Summary: We analyse the performance of several iterative algorithms for the quantisation of a probability measure μ, based on the minimisation of a Maximum Mean Discrepancy (MMD). Our analysis includes kernel herding, greedy MMD minimisation and Sequential Bayesian Quadrature (SBQ). We show that the finite-sample-size approximation error, measured by the MMD, decreases as 1/n for SBQ and also for kernel herding and greedy MMD minimisation when using a suitable step-size sequence. The upper bound on the approximation error is slightly better for SBQ, but the other methods are significantly faster, with a computational cost that increases only linearly with the number of points selected. This is illustrated by two numerical examples, with the target measure μ being uniform (a space-filling design application) and with μ a Gaussian mixture. They suggest that the bounds derived in the paper are overly pessimistic, in particular for SBQ. The sources of this pessimism are identified but seem difficult to counter.
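
As a rough illustration of the kind of algorithm analysed in the paper, the sketch below runs plain kernel herding on a finite candidate-set approximation of a uniform target measure and reports the resulting squared MMD. The Gaussian kernel, its bandwidth, the candidate-set size and the maximisation over candidates are all assumptions made for this illustration only; they are not details taken from the paper, which also covers greedy MMD minimisation and SBQ.

import numpy as np

# Illustrative sketch only: plain kernel herding on a candidate-set
# approximation of a uniform target measure mu on [0,1]^2. The Gaussian
# kernel, its bandwidth and the candidate-set maximisation are assumptions
# made for this example, not details taken from the paper.

rng = np.random.default_rng(0)
candidates = rng.random((2000, 2))        # surrogate for mu = uniform on [0,1]^2
sigma = 0.2                               # kernel bandwidth (arbitrary choice)

def gauss_kernel(X, Y):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

# Potential P_mu(x) = E_{y~mu} k(x, y), estimated over the candidate sample
K = gauss_kernel(candidates, candidates)
potential = K.mean(axis=1)

n = 50                                    # number of quantisation points
selected = []
k_sum = np.zeros(len(candidates))         # sum_j k(., x_j) over points chosen so far
for t in range(n):
    # herding criterion: high potential, low kernel similarity to chosen points
    score = potential - k_sum / (t + 1)
    idx = int(np.argmax(score))
    selected.append(idx)
    k_sum += K[:, idx]

# Squared MMD between the (discretised) target and the empirical measure on
# the n selected points; it should shrink as n grows.
mmd2 = K.mean() - 2.0 * potential[selected].mean() + K[np.ix_(selected, selected)].mean()
print(f"squared MMD with n = {n} points: {mmd2:.2e}")

Each iteration costs one pass over the candidate set, so the total cost grows linearly with the number of points selected, which matches the linear-cost behaviour of the herding-type methods discussed in the abstract.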
ISSN: 0960-3174, 1573-1375
DOI: 10.1007/s11222-022-10184-1