Building cost functions minimizing to some summary statistics
Published in: IEEE Transactions on Neural Networks and Learning Systems, 2000-11, Vol. 11 (6), pp. 1263-1271
Main Author:
Format: Article
Language: English
Subjects:
Summary: A learning machine (or a model) is usually trained by minimizing a given criterion (the expectation of the cost function), measuring the discrepancy between the model output and the desired output. As is already well known, the choice of the cost function has a profound impact on the probabilistic interpretation of the output of the model after training. In this work, we use the calculus of variations in order to tackle this problem. In particular, we derive necessary and sufficient conditions on the cost function ensuring that the output of the trained model approximates 1) the conditional expectation of the desired output given the explanatory variables; 2) the conditional median (and, more generally, the q-quantile); 3) the conditional geometric mean; and 4) the conditional variance. The same method could be applied to the estimation of other summary statistics as well. We also argue that the least absolute deviations criterion could, in some cases, act as an alternative to the ordinary least squares criterion for nonlinear regression. In the same vein, the concept of "regression quantile" is briefly discussed.
ISSN: 1045-9227, 2162-237X, 1941-0093, 2162-2388
DOI: 10.1109/72.883416
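
The summary above states that the choice of cost function determines which conditional statistic the trained output approximates. As a minimal sketch (not the paper's derivation, which uses the calculus of variations), the snippet below checks numerically, for a constant prediction on synthetic data, that the squared-error criterion is minimized at the mean, the least-absolute-deviations criterion at the median, and the quantile ("pinball") loss at the q-quantile. All function names, the lognormal data, and the choice q = 0.9 are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Illustrative sketch only: the paper derives general necessary and sufficient
# conditions; here we just verify, with a constant prediction on synthetic data,
# that standard cost functions are minimized at the statistics listed in the
# summary. Loss names, q=0.9, and the lognormal targets are assumptions.

rng = np.random.default_rng(0)
y = rng.lognormal(mean=0.0, sigma=1.0, size=50_000)  # skewed targets

def squared_error(pred, y):
    # Ordinary least squares criterion -> minimized at the mean.
    return np.mean((y - pred) ** 2)

def absolute_error(pred, y):
    # Least absolute deviations criterion -> minimized at the median.
    return np.mean(np.abs(y - pred))

def pinball_loss(pred, y, q):
    # Quantile ("pinball") loss -> minimized at the q-quantile.
    diff = y - pred
    return np.mean(np.maximum(q * diff, (q - 1.0) * diff))

# Brute-force search over constant predictions.
grid = np.linspace(y.min(), y.max(), 2001)

best_l2 = grid[np.argmin([squared_error(p, y) for p in grid])]
best_l1 = grid[np.argmin([absolute_error(p, y) for p in grid])]
best_q = grid[np.argmin([pinball_loss(p, y, q=0.9) for p in grid])]

print(f"squared error minimizer   {best_l2:.3f}  vs mean         {y.mean():.3f}")
print(f"absolute error minimizer  {best_l1:.3f}  vs median       {np.median(y):.3f}")
print(f"pinball(q=0.9) minimizer  {best_q:.3f}  vs 0.9-quantile  {np.quantile(y, 0.9):.3f}")
```

In the same spirit, the conditional geometric mean mentioned in the summary can be illustrated by minimizing squared error on log-transformed targets and exponentiating the minimizer, since the geometric mean of y equals exp of the mean of log y; the paper's conditional-variance result has no such one-line analogue and is not reproduced here.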