Comparison of locally weighted PLS strategies for regression and discrimination on agronomic NIR data
| Published in: | Journal of Chemometrics, 2020-05, Vol. 34 (5), p. n/a |
|---|---|
| Main Authors: | , , |
| Format: | Article |
| Language: | English |
Summary: In multivariate calibration, locally weighted partial least squares regression (LWPLSR) is an efficient prediction method when data heterogeneity generates nonlinear relations (curvature and clustering) between the response and the explanatory variables. This is frequent in agronomic data sets that gather materials of different natures or origins. LWPLSR is a particular case of weighted PLSR (WPLSR), i.e., a statistical weight different from the standard 1/n is given to each of the n calibration observations when calculating the PLS scores/loadings and the predictions. In LWPLSR, the weights depend on the dissimilarity (which has to be defined and computed) between each calibration observation and the new observation to predict. This article compares two LWPLSR strategies: (a) "LW", the usual strategy, in which, for each new observation to predict, a WPLSR is applied to all n calibration observations (i.e., the entire calibration set), vs (b) "KNN‑LW", in which k nearest neighbors of the observation to predict are first selected in the training set and WPLSR is applied only to this selected KNN set. On three illustrative agronomic data sets (quantitative and discrimination predictions), both strategies outperformed standard PLSR. LW and KNN‑LW had similar prediction performances, but KNN‑LW was much faster in computation time; the KNN‑LW strategy is therefore recommended for large data sets. The article also presents a new WPLSR algorithm, based on the "improved kernel #1" algorithm, which is competitive with, and in general faster than, the previously published weighted PLS nonlinear iterative partial least squares (NIPALS) algorithm.
Locally weighted partial least squares regression (LWPLSR) is a particular case of weighted PLSR (WPLSR) in which the weights, given to the calibration observations for calculating the PLS scores/loadings and the prediction, depend on the dissimilarity to the new observation to predict. This article compares two LWPLSR strategies: (a) "LW", the usual LWPLSR strategy, vs (b) "KNN‑LW", in which k nearest neighbors of the observation to predict are first selected and WPLSR is applied only to these neighbors.
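As a rough illustration of the KNN‑LW strategy summarized above, the Python sketch below selects the k nearest neighbors of a new observation, turns their dissimilarities into observation weights, fits a weighted PLS1 on that local subset only, and predicts. All function and parameter names (`knn_lwplsr_predict`, `k`, `h`, `n_comp`) are hypothetical; the Euclidean dissimilarity and exponential weight kernel are simplifying assumptions, and the weighted PLS shown is a plain NIPALS-style PLS1 with a diagonal observation metric, not the authors' "improved kernel #1" implementation.

```python
# Hypothetical sketch of the KNN-LW idea described in the abstract; names,
# dissimilarity measure and weight kernel are illustrative assumptions.
import numpy as np

def weighted_pls1_coefs(X, y, d, n_comp):
    """Weighted PLS1: observation weights d (summing to 1) act as a diagonal
    metric when computing means, scores and loadings. Returns the weighted
    means and the regression coefficient vector."""
    xmean = d @ X
    ymean = d @ y
    Xc, yc = X - xmean, y - ymean
    W, P, c = [], [], []
    for _ in range(n_comp):
        w = Xc.T @ (d * yc)                 # weighted covariance direction
        w /= np.linalg.norm(w)
        t = Xc @ w                          # scores
        tt = t @ (d * t)
        p = Xc.T @ (d * t) / tt             # X loadings
        cc = (d * yc) @ t / tt              # y loading
        Xc = Xc - np.outer(t, p)            # deflation
        yc = yc - cc * t
        W.append(w); P.append(p); c.append(cc)
    W, P, c = np.array(W).T, np.array(P).T, np.array(c)
    b = W @ np.linalg.solve(P.T @ W, c)     # regression coefficients
    return xmean, ymean, b

def knn_lwplsr_predict(Xtrain, ytrain, xnew, k=50, h=2.0, n_comp=10):
    """KNN-LW prediction for one new observation xnew (n_comp should be < k)."""
    dist = np.linalg.norm(Xtrain - xnew, axis=1)      # simple Euclidean dissimilarity
    nn = np.argsort(dist)[:k]                         # k nearest neighbors
    dnn = dist[nn]
    w = np.exp(-dnn / (h * np.median(dnn) + 1e-12))   # decreasing weight kernel
    w /= w.sum()
    xmean, ymean, b = weighted_pls1_coefs(Xtrain[nn], ytrain[nn], w, n_comp)
    return ymean + (xnew - xmean) @ b
```

The "LW" strategy corresponds to calling the same weighted fit on the entire calibration set (k = n) instead of a KNN subset, which is what makes KNN‑LW attractive in computation time for large data sets.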
| ISSN: | 0886-9383; 1099-128X |
|---|---|
| DOI: | 10.1002/cem.3209 |