
Sharpening the Toolbox of Computational Chemistry: A New Approximation of Critical F-Values for Multiple Linear Regression

Bibliographic Details
Published in: Journal of Chemical Information and Modeling, 2009-01, Vol. 49 (1), p. 28-34
Main Authors: Kramer, Christian, Tautermann, Christofer S, Livingstone, David J, Salt, David W, Whitley, David C, Beck, Bernd, Clark, Timothy
Format: Article
Language:English
Description
Summary: Multiple linear regression is a major tool in computational chemistry. Although it has been used for more than 30 years, it has only recently been noted within the cheminformatics community that the standard F-values used to assess the significance of the resulting models are inappropriate in situations where the variables included in a model are chosen from a large pool of descriptors, due to an effect known in the statistical literature as selection bias. We have used Monte Carlo simulations to estimate the critical F-values for many combinations of sample size (n), model size (p), and descriptor pool size (k), using stepwise regression, one of the methods most commonly used to derive linear models from large sets of molecular descriptors. The values of n, p, and k represent cases appropriate to contemporary cheminformatics data sets. A formula for general n, p, and k values has been developed from the numerical estimates that approximates the critical stepwise F-values at 90%, 95%, and 99% significance levels. This approximation reproduces both the original simulated values and an interpolation test set (within the range of the training values) with an R² value greater than 0.995. For an extrapolation test set of cases outside the range of the training set, the approximation produced an R² above 0.93.
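
As a rough illustration of the Monte Carlo procedure the summary describes, the Python sketch below estimates a selection-adjusted critical F-value by repeatedly applying forward variable selection to pure-noise data and taking a percentile of the resulting overall F-statistics. This is not the authors' implementation: forward selection stands in for the full stepwise procedure, and the values of n, p, k, the number of simulations, and the function names are illustrative assumptions only.

```python
# Minimal sketch (not the authors' code) of the Monte Carlo idea: when the p
# descriptors in a model are selected from a pool of k candidates, the overall
# F-statistic is inflated relative to the tabulated F(p, n-p-1) distribution,
# so the appropriate critical value is estimated by simulation on pure noise.
import numpy as np
from scipy.stats import f as f_dist


def stepwise_f_statistic(n, p, k, rng):
    """Forward-select p of k random descriptors for a random response
    and return the overall F-statistic of the resulting model."""
    X = rng.standard_normal((n, k))   # pure-noise descriptor pool
    y = rng.standard_normal(n)        # pure-noise response
    selected = []
    for _ in range(p):
        best_rss, best_j = np.inf, None
        for j in range(k):
            if j in selected:
                continue
            cols = np.column_stack([np.ones(n)] + [X[:, s] for s in selected + [j]])
            beta, rss_arr, *_ = np.linalg.lstsq(cols, y, rcond=None)
            rss = rss_arr[0] if rss_arr.size else np.sum((y - cols @ beta) ** 2)
            if rss < best_rss:
                best_rss, best_j = rss, j
        selected.append(best_j)
    tss = np.sum((y - y.mean()) ** 2)
    return ((tss - best_rss) / p) / (best_rss / (n - p - 1))


def critical_f(n, p, k, alpha=0.05, n_sim=1000, seed=0):
    """Empirical (selection-adjusted) critical F-value at level alpha."""
    rng = np.random.default_rng(seed)
    stats = [stepwise_f_statistic(n, p, k, rng) for _ in range(n_sim)]
    return float(np.percentile(stats, 100 * (1 - alpha)))


if __name__ == "__main__":
    n, p, k = 50, 3, 100  # illustrative sample, model, and pool sizes
    print("tabulated F(95%):", f_dist.ppf(0.95, p, n - p - 1))
    print("simulated F(95%):", critical_f(n, p, k))
```

When k is much larger than p, the simulated critical value typically exceeds the tabulated F(p, n−p−1) value by a wide margin, which is the selection-bias effect the paper quantifies; the approximation formula reported in the article replaces this brute-force simulation for general n, p, and k.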
ISSN: 1549-9596
1520-5142
1549-960X
DOI: 10.1021/ci800318q