SLICED INVERSE REGRESSION IN METRIC SPACES

Bibliographic Details
Published in: Statistica Sinica 2022-01, Vol. 32, pp. 2315-2337
Main Authors: Virta, Joni; Lee, Kuang-Yao; Li, Lexin
Format: Article
Language: English
Description
Summary: In this article, we propose a general nonlinear sufficient dimension reduction (SDR) framework for the case where both the predictor and the response lie in general metric spaces. We construct reproducing kernel Hilbert spaces whose kernels are fully determined by the distance functions of the metric spaces, and then leverage the inherent structures of these spaces to define a nonlinear SDR framework. We adapt classical sliced inverse regression to metric space data within this framework. Next, we build an estimator based on the corresponding linear operators and show that it recovers the regression information in an unbiased manner. We derive the estimator both at the operator level and under a coordinate system, and establish its convergence rate. Lastly, we illustrate the proposed method using synthetic and real data sets that exhibit non-Euclidean geometry.
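
Illustrative sketch: the following Python/NumPy code is a minimal sketch of the general idea of slicing-based SDR with a kernel determined entirely by a distance matrix. It is not the authors' estimator; the Gaussian-type kernel, the median-distance bandwidth heuristic, the slicing scheme, the regularization, and all function names are assumptions made for this example, and the response is taken to be real-valued for simplicity even though the paper allows metric-space responses.

# Minimal kernel-SIR-style sketch on distance-matrix input (illustrative only).
import numpy as np

def distance_kernel(D, bandwidth=None):
    # Kernel determined by distances: k(x, x') = exp(-d(x, x')^2 / (2 h^2)).
    if bandwidth is None:
        bandwidth = np.median(D[D > 0])  # heuristic bandwidth (assumption)
    return np.exp(-D**2 / (2 * bandwidth**2))

def kernel_sir(D_x, y, n_slices=5, n_components=2, reg=1e-6):
    # D_x : (n, n) pairwise distance matrix of the metric-space predictors
    # y   : (n,) real-valued response, used only for slicing
    n = D_x.shape[0]
    K = distance_kernel(D_x)
    H = np.eye(n) - np.ones((n, n)) / n        # centering matrix
    Kc = H @ K @ H                             # centered Gram matrix

    # Slice the response and average the centered kernel columns within slices.
    order = np.argsort(y)
    slices = np.array_split(order, n_slices)
    M = np.zeros((n, n))
    for idx in slices:
        m = Kc[:, idx].mean(axis=1, keepdims=True)   # slice mean in feature space
        M += (len(idx) / n) * (m @ m.T)              # between-slice covariance

    # Generalized eigenproblem M v = lambda (Kc Kc + reg I) v, solved crudely.
    A = Kc @ Kc + reg * np.eye(n)
    eigvals, eigvecs = np.linalg.eig(np.linalg.solve(A, M))
    top = np.argsort(-eigvals.real)[:n_components]
    directions = eigvecs[:, top].real
    # Reduced coordinates of each observation in the estimated subspace.
    return Kc @ directions

# Toy usage with Euclidean data (any metric's distance matrix would work):
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X[:, 0] + 0.1 * rng.normal(size=100)
D_x = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
Z = kernel_sir(D_x, y, n_slices=5, n_components=1)
print(Z.shape)  # (100, 1)
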
ISSN: 1017-0405; 1996-8507
DOI: 10.5705/ss.202022.0097