Maximum Mean Discrepancy Distributionally Robust Nonlinear Chance-Constrained Optimization with Finite-Sample Guarantee
| Published in: | arXiv.org, 2022-04 |
|---|---|
| Main Authors: | |
| Format: | Article |
| Language: | English |
| Subjects: | |
| Online Access: | Get full text |
| Summary: | This paper is motivated by addressing open questions in distributionally robust chance-constrained programs (DRCCP) using the popular Wasserstein ambiguity sets. Specifically, the computational techniques for those programs typically place restrictive assumptions on the constraint functions, and the size of the Wasserstein ambiguity sets is often set using costly cross-validation (CV) procedures or conservative measure concentration bounds. In contrast, we propose a practical DRCCP algorithm using kernel maximum mean discrepancy (MMD) ambiguity sets, which we term MMD-DRCCP, to treat general nonlinear constraints without using ad-hoc reformulation techniques. MMD-DRCCP can handle general nonlinear and non-convex constraints with a proven finite-sample constraint satisfaction guarantee of a dimension-independent \(\mathcal{O}(\frac{1}{\sqrt{N}})\) rate, achievable by a practical algorithm. We further propose an efficient bootstrap scheme for constructing sharp MMD ambiguity sets in practice without resorting to CV. Our algorithm is validated numerically on a portfolio optimization problem and a tube-based distributionally robust model predictive control problem with non-convex constraints. |
| ISSN: | 2331-8422 |
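The summary above centers on two generic ingredients: an empirical MMD between distributions, and a bootstrap quantile that sizes the ambiguity set in place of cross-validation. The sketch below illustrates both under stated assumptions; the Gaussian kernel, the bandwidth, the resampling scheme, and the names `mmd_sq` and `bootstrap_mmd_radius` are illustrative choices, not the paper's implementation.

```python
import numpy as np

def gaussian_gram(X, Y, bandwidth=1.0):
    """Gaussian-kernel Gram matrix: k(x, y) = exp(-||x - y||^2 / (2 * bandwidth^2))."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_sq(X, Y, bandwidth=1.0):
    """Biased (V-statistic) estimate of MMD^2 between the empirical
    distributions of samples X and Y:
    MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)]."""
    kxx = gaussian_gram(X, X, bandwidth).mean()
    kyy = gaussian_gram(Y, Y, bandwidth).mean()
    kxy = gaussian_gram(X, Y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

def bootstrap_mmd_radius(X, n_boot=500, alpha=0.05, bandwidth=1.0, seed=None):
    """One plausible bootstrap scheme for an MMD ambiguity-set radius
    (an assumption for illustration, not the paper's exact procedure):
    resample the data with replacement, record the MMD between each
    bootstrap empirical distribution and the original one, and take the
    (1 - alpha) quantile as a data-driven radius."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    stats = np.empty(n_boot)
    for b in range(n_boot):
        Xb = X[rng.integers(0, n, size=n)]          # bootstrap resample
        stats[b] = np.sqrt(max(mmd_sq(Xb, X, bandwidth), 0.0))
    return np.quantile(stats, 1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(200, 3))                # N = 200 samples of a 3-dim uncertainty
    eps = bootstrap_mmd_radius(data, seed=1)
    print(f"bootstrap MMD radius: {eps:.4f}")
```

A radius computed this way shrinks at roughly the \(\mathcal{O}(\frac{1}{\sqrt{N}})\) rate, consistent with the dimension-independent concentration of the MMD that the summary invokes for the finite-sample guarantee.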