Evolution and generalization of a single neurone: II. Complexity of statistical classifiers and sample size considerations

Bibliographic Details
Published in: Neural Networks, 1998-03, Vol. 11 (2), p. 297-313
Main Author: RAUDYS, S
Format: Article
Language:English
Description
Summary: Unlike many other investigations on this topic, the present one does not consider the nonlinear SLP as a single special type of classification rule. In SLP training we can obtain seven statistical classifiers of differing complexity: (1) the Euclidean distance classifier; (2) the standard Fisher linear discriminant function (DF); (3) the Fisher linear DF with pseudo-inversion of the covariance matrix; (4) regularized linear discriminant analysis; (5) the generalized Fisher DF; (6) the minimum empirical error classifier; and (7) the maximum margin classifier. A survey of earlier and new results on the relationships between the complexity of six classifiers, the generalization error, and the number of learning examples is presented. These relationships depend on the complexities of both the classifier and the data. This knowledge indicates how to control the SLP classifier's complexity purposefully by choosing optimal values of the targets, the learning step and its change during training, and the number of iterations, and by adding or removing a regularization term. Correct initialization of the weights and a simplified data structure can also help to reduce the generalization error.
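To make the first two classifiers in the abstract's list concrete, here is a minimal sketch (not taken from the paper; the two-class Gaussian data and function names are illustrative assumptions) of the Euclidean distance classifier and the standard Fisher linear DF on synthetic data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Two synthetic Gaussian classes (illustrative only, not the paper's data).
X1 = rng.normal(loc=0.0, scale=1.0, size=(100, 2))
X2 = rng.normal(loc=2.0, scale=1.0, size=(100, 2))

m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

def euclidean_classify(x):
    # (1) Euclidean distance classifier: assign x to the nearest class mean.
    return 1 if np.linalg.norm(x - m1) <= np.linalg.norm(x - m2) else 2

# (2) Standard Fisher linear DF: w = S^{-1} (m1 - m2), thresholded at the
# midpoint of the class means, with S the pooled sample covariance matrix.
S = 0.5 * (np.cov(X1, rowvar=False) + np.cov(X2, rowvar=False))
w = np.linalg.solve(S, m1 - m2)
b = -0.5 * w @ (m1 + m2)

def fisher_classify(x):
    return 1 if w @ x + b >= 0 else 2
```

The progression the abstract describes (Euclidean → Fisher → regularized → minimum empirical error → maximum margin) corresponds to discriminants that use progressively more information about, and make progressively weaker assumptions on, the class covariance structure.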
ISSN:0893-6080
1879-2782
DOI:10.1016/S0893-6080(97)00136-6