
Approximations of conditional probability density functions in Lebesgue spaces via mixture of experts models


Bibliographic Details
Published in: Journal of Statistical Distributions and Applications, 2021-08, Vol. 8 (1), Article 13
Main Authors: Nguyen, Hien Duy; Nguyen, TrungTin; Chamroukhi, Faicel; McLachlan, Geoffrey John
Format: Article
Language: English
Description
Summary: Mixture of experts (MoE) models are widely applied to conditional probability density estimation problems. We demonstrate the richness of the class of MoE models by proving denseness results in Lebesgue spaces when the input and output variables are both compactly supported. We further prove an almost uniform convergence result when the input is univariate. Auxiliary lemmas are proved regarding the richness of the soft-max gating function class and its relationship to the class of Gaussian gating functions.
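
For orientation, a soft-max gated MoE conditional density is commonly written in the form below; this is the standard formulation, and the article's exact parameterization may differ:

m(y \mid x) = \sum_{k=1}^{K} g_k(x; \gamma)\, f_k(y \mid x; \theta_k),
\qquad
g_k(x; \gamma) = \frac{\exp\!\left(\gamma_{k0} + \gamma_k^{\top} x\right)}{\sum_{l=1}^{K} \exp\!\left(\gamma_{l0} + \gamma_l^{\top} x\right)},

where g_1, \dots, g_K are the soft-max gating functions and f_1, \dots, f_K are the expert densities (e.g., Gaussian experts). The denseness results concern the class of such densities m as the number of experts K grows.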
ISSN: 2195-5832
DOI: 10.1186/s40488-021-00125-0