Sequential selection of discrete features for neural networks – A Bayesian approach to building a cascade
Published in: Pattern Recognition Letters, 1999-11, Vol. 20 (11), pp. 1439-1448
Main Authors: , ,
Format: Article
Language: English
Summary: A feature selection procedure is used to successively remove features one by one from a statistical classifier via an iterative backward search. Each classifier uses a smaller subset of features than the classifier in the previous iteration. The classifiers are subsequently combined into a cascade. Each classifier in the cascade should classify the cases to which a reliable class label can be assigned; the remaining cases are propagated to the next classifier, which also uses the value of one additional feature. Experiments demonstrate the feasibility of building cascades of classifiers (neural networks for the prediction of atrial fibrillation (AF)) using a backward search scheme for feature selection.
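The cascade described in the summary can be sketched in a few lines. The following is a minimal illustration only, assuming scikit-learn's MLPClassifier as the neural network and a hypothetical reliability threshold THRESHOLD; the function names (backward_selection, predict_cascade) and the placeholder feature-removal rule are assumptions for illustration, not the authors' Bayesian selection criterion.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

THRESHOLD = 0.95  # assumed reliability threshold (not specified in this record)

def backward_selection(X, y):
    """Train one classifier per step of an iterative backward search,
    removing one feature after each step."""
    features = list(range(X.shape[1]))
    stages = []
    while features:
        clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500)
        clf.fit(X[:, features], y)
        stages.append((list(features), clf))
        # The paper chooses the feature to drop with a Bayesian criterion;
        # dropping the last remaining feature is only a placeholder here.
        features.pop()
    return stages

def predict_cascade(stages, x):
    """Run the cascade from the smallest feature subset to the full set,
    stopping as soon as a classification is considered reliable."""
    for features, clf in reversed(stages):  # fewest features first
        proba = clf.predict_proba(np.asarray(x)[features].reshape(1, -1))[0]
        if proba.max() >= THRESHOLD:
            return int(np.argmax(proba))    # reliable label: stop here
    return int(np.argmax(proba))            # fall back to the full-feature classifier
```

In this sketch, each stage only answers when its predicted class probability exceeds the threshold; otherwise the case is handed to the next stage, which sees one additional feature, mirroring the propagation scheme described above.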
ISSN: 0167-8655 (print), 1872-7344 (electronic)
DOI: 10.1016/S0167-8655(99)00112-9