The KL estimator for the inverse Gaussian regression model
Published in: Concurrency and Computation 2021-07, Vol. 33 (13), p. n/a
Format: Article
Language: English
Summary: Multicollinearity has an undesirable effect on the efficiency of the maximum likelihood estimator (MLE) in both Gaussian and non‐Gaussian regression models. The ridge and the Liu estimators have been developed as alternatives to the MLE; both possess a smaller mean squared error (MSE) than the MLE. Recently, Kibria and Lukman developed the KL estimator, which was found to outperform the ridge and the Liu estimators in the linear regression model. With this expectation, we develop the KL estimator for the inverse Gaussian regression model. We compare the proposed estimator's performance with some existing estimators through theoretical comparison, a simulation study, and a real‐life application. By the smaller-MSE criterion, the proposed estimator with one of its shrinkage parameters performs best.
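As a rough illustration of the estimator the abstract refers to, the following sketch implements the Kibria–Lukman (KL) estimator in its original *linear* regression form, which takes the shape beta_KL = (X'X + kI)^{-1}(X'X − kI) beta_OLS for a shrinkage parameter k > 0. This is an assumption-laden sketch: the article's inverse Gaussian version would replace X'X with the weighted cross-product matrix from an iteratively reweighted likelihood fit, which is omitted here, and the data below are synthetic.

```python
import numpy as np

def ols(X, y):
    """Ordinary least squares estimate (the MLE under Gaussian errors)."""
    return np.linalg.solve(X.T @ X, X.T @ y)

def kl_estimator(X, y, k):
    """Kibria-Lukman shrinkage estimator; k = 0 recovers plain OLS."""
    XtX = X.T @ X
    I = np.eye(X.shape[1])
    beta_ols = ols(X, y)
    # (X'X + kI)^{-1} (X'X - kI) beta_OLS : each eigen-component of
    # beta_OLS is multiplied by (lam - k)/(lam + k), which has absolute
    # value < 1 for k > 0, so the estimate is shrunk toward zero.
    return np.linalg.solve(XtX + k * I, (XtX - k * I) @ beta_ols)

# Example with deliberately collinear predictors (hypothetical data).
rng = np.random.default_rng(0)
z = rng.normal(size=100)
X = np.column_stack([z, z + 0.01 * rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.normal(size=100)

beta_ols = kl_estimator(X, y, k=0.0)  # identical to ols(X, y)
beta_kl = kl_estimator(X, y, k=0.5)   # shrunk coefficient vector
```

Under multicollinearity the OLS/MLE coefficients are inflated; the shrinkage factor (lam − k)/(lam + k) damps the ill-conditioned directions most, which is the mechanism behind the smaller MSE reported in the abstract.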
ISSN: 1532-0626, 1532-0634
DOI: 10.1002/cpe.6222