Model Averaging Estimation Method by Kullback–Leibler Divergence for Multiplicative Error Model
Published in: Complexity (New York, N.Y.), 2022-01, Vol. 2022 (1)
Main Authors:
Format: Article
Language: English
Summary: In this paper, we propose a model averaging estimation method for the multiplicative error model and construct the corresponding weight-choosing criterion based on the Kullback–Leibler divergence, with a hyperparameter to avoid overfitting. The resulting model averaging estimator is proved to be asymptotically optimal. It is shown that the Kullback–Leibler model averaging (KLMA) estimator asymptotically minimizes the in-sample Kullback–Leibler divergence and improves out-of-sample forecast accuracy even under different loss functions. In simulations, the KLMA estimator compares favorably with the smooth-AIC (SAIC), smooth-BIC (SBIC), and Mallows model averaging (MMA) estimators, especially when nonlinear noise is added to the data-generating process. Empirical applications to the daily range of the S&P 500 and the price durations of IBM show that the out-of-sample forecasting capacity of the KLMA estimator is better than that of the other methods.
ISSN: 1076-2787
eISSN: 1099-0526
DOI: 10.1155/2022/7706992
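For readers who want a concrete picture of the approach described in the summary, the sketch below illustrates KL-based model averaging for a multiplicative error model in Python. It is a minimal illustration under stated assumptions, not the paper's implementation: the candidate models, the exponential quasi-likelihood form of the KL criterion, and the penalty `lam * (weighted parameter count)` standing in for the paper's hyperparameter are all assumptions made for this example.

```python
# Illustrative sketch of KL-based model averaging for a multiplicative
# error model (MEM). The exact KLMA criterion of the paper is NOT
# reproduced; the penalty form below is an assumption for illustration.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# --- Simulate a simple MEM: x_t = mu_t * eps_t, with eps_t ~ Exp(1) ---
T = 500
mu_true = 1.0 + 0.8 * np.sin(np.linspace(0, 6 * np.pi, T)) ** 2
x = mu_true * rng.exponential(1.0, size=T)

def fitted_means(x, lag):
    """Crude stand-in for a candidate MEM specification: the conditional
    mean is a blend of the sample mean and a moving average of past x."""
    T = len(x)
    mu = np.full(T, x.mean())
    if lag > 0:
        for t in range(lag, T):
            mu[t] = 0.5 * x.mean() + 0.5 * x[t - lag:t].mean()
    return mu

# Candidate models of increasing memory, with rough parameter counts.
candidates = [fitted_means(x, lag) for lag in (0, 1, 5, 20)]
n_params = np.array([1, 2, 2, 2])

def kl_criterion(w, lam=2.0):
    """In-sample exponential quasi-KL of the weight-averaged mean, plus a
    hyperparameter penalty on the weighted parameter count (assumed form)."""
    mu_w = sum(wi * mi for wi, mi in zip(w, candidates))
    mu_w = np.clip(mu_w, 1e-8, None)
    quasi_kl = np.sum(np.log(mu_w) + x / mu_w)
    return quasi_kl + lam * (w @ n_params)

# --- Minimize the criterion over the weight simplex (SLSQP) ---
M = len(candidates)
cons = ({"type": "eq", "fun": lambda w: w.sum() - 1.0},)
res = minimize(kl_criterion, np.full(M, 1 / M),
               bounds=[(0, 1)] * M, constraints=cons)
print("KLMA-style weights:", np.round(res.x, 3))
```

As with SAIC, SBIC, and MMA, the weights are constrained to the unit simplex; raising `lam` shifts weight toward more parsimonious candidates, which is the role the abstract attributes to the hyperparameter in guarding against overfitting.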