Parameterized Convex Universal Approximators for Decision-Making Problems
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2024-02, Vol. 35 (2), pp. 2448-2459
Main Authors:
Format: Article
Language: English
Summary: Parameterized max-affine (PMA) and parameterized log-sum-exp (PLSE) networks are proposed for general decision-making problems. The proposed approximators generalize existing convex approximators, namely max-affine (MA) and log-sum-exp (LSE) networks, by taking both condition and decision variables as function arguments and replacing the network parameters of MA and LSE networks with continuous functions of the condition variable. The universal approximation theorem (UAT) for PMA and PLSE is proved, which implies that PMA and PLSE are shape-preserving universal approximators for parameterized convex continuous functions. Practical guidelines for incorporating deep neural networks within PMA and PLSE networks are provided. A numerical simulation is performed to demonstrate the performance of the proposed approximators. The simulation results support that PLSE outperforms other existing approximators in terms of minimizer and optimal-value errors, with scalable and efficient computation in high-dimensional cases.
ISSN: 2162-237X, 2162-2388
DOI: 10.1109/TNNLS.2022.3190198
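The summary above describes MA and LSE networks (convex in the decision variable) and their parameterized variants, where the affine parameters become continuous functions of a condition variable. The sketch below illustrates that idea only; the specific parameter maps a_i(c) and b_i(c) are toy choices for illustration, not the construction or architecture used in the paper.

```python
import numpy as np

def max_affine(x, A, b):
    """MA network: max_i (A_i @ x + b_i); piecewise-linear and convex in x."""
    return (A @ x + b).max()

def lse(x, A, b, T=1.0):
    """LSE network: T * log(sum_i exp((A_i @ x + b_i) / T)).
    A smooth convex upper bound on max_affine; T controls the smoothing."""
    z = (A @ x + b) / T
    m = z.max()  # log-sum-exp shift for numerical stability
    return T * (m + np.log(np.exp(z - m).sum()))

def plse(x, c, T=1.0):
    """Parameterized LSE sketch: the affine parameters are continuous
    functions of the condition variable c. The maps below are hand-picked
    toy examples (an assumption for illustration); in practice they would
    be outputs of a neural network taking c as input."""
    A = np.array([[1.0 + c], [-0.5 * c], [c - 1.0]])  # a_i(c), toy choice
    b = np.array([0.0, c**2, -c])                     # b_i(c), toy choice
    return lse(x, A, b, T)

x = np.array([0.7])
A = np.array([[1.0], [-1.0]])
b = np.array([0.0, 0.5])
print(max_affine(x, A, b))  # max(0.7, -0.2) = 0.7
print(lse(x, A, b))         # slightly above the max, by construction
print(plse(x, c=0.3))       # convex in x for any fixed condition c
```

For any fixed c, plse remains convex in x (an LSE of affine functions), which is the shape-preservation property the abstract refers to.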