
Representing and learning Boolean functions of multivalued features

Bibliographic Details
Published in: IEEE Transactions on Systems, Man, and Cybernetics, 1990-01, Vol. 20 (1), p. 67-80
Main Authors: Hampson, S.E., Volper, D.J.
Format: Article
Language:English
Description
Summary: An analysis and empirical measurement of threshold linear functions of multivalued features is presented. The number of thresholded linear functions, the maximum weight size, the training speed, and the number of nodes needed to represent arbitrary Boolean functions are all shown to increase polynomially with the number of distinct values the input features can assume and exponentially with the number of features. Two network training algorithms, focusing and back propagation, are described. Empirically, both are capable of learning arbitrary Boolean functions of multivalued features in a two-level net. Focusing is proved to converge to a correct classification and permits some time-space complexity analysis; its training time is polynomial in the number of values a feature can assume and exponential in the number of features. Back propagation is not necessarily convergent, but for randomly generated Boolean functions the empirical behavior of the implementation is similar to that of the focusing algorithm.
ISSN:0018-9472
2168-2909
DOI:10.1109/21.47810
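
The summary above refers to thresholded linear functions of multivalued features trained in a perceptron-like fashion. As a rough, self-contained illustration only (not code from the paper, and not its focusing algorithm), the sketch below one-hot encodes each multivalued feature and trains a single thresholded linear unit with an ordinary perceptron update on a toy linearly separable target; all function names, the encoding choice, and the toy target are assumptions made for the example.

# Illustrative sketch (not from the paper): a thresholded linear unit over
# multivalued features, with each feature one-hot encoded and the weights
# trained by an ordinary perceptron update rule.
import itertools
import random

def encode(x, num_values):
    # One-hot encode each feature of x; feature values lie in 0..num_values-1.
    bits = []
    for v in x:
        bits.extend(1 if v == j else 0 for j in range(num_values))
    return bits

def predict(weights, bias, encoded):
    # Thresholded linear function: output 1 iff w . x + b > 0.
    return 1 if sum(w * e for w, e in zip(weights, encoded)) + bias > 0 else 0

def train_perceptron(examples, num_features, num_values, epochs=200):
    # Perceptron training; guaranteed to converge only when the target is
    # linearly separable in the one-hot encoding.
    dim = num_features * num_values
    weights, bias = [0] * dim, 0
    for _ in range(epochs):
        mistakes = 0
        for x, label in examples:
            enc = encode(x, num_values)
            out = predict(weights, bias, enc)
            if out != label:
                mistakes += 1
                delta = label - out  # +1 or -1
                weights = [w + delta * e for w, e in zip(weights, enc)]
                bias += delta
        if mistakes == 0:
            break
    return weights, bias

if __name__ == "__main__":
    # Toy separable target over two 3-valued features: 1 iff x0 + x1 >= 3.
    num_features, num_values = 2, 3
    data = [(x, 1 if sum(x) >= 3 else 0)
            for x in itertools.product(range(num_values), repeat=num_features)]
    random.shuffle(data)
    w, b = train_perceptron(data, num_features, num_values)
    print(all(predict(w, b, encode(x, num_values)) == y for x, y in data))

A single thresholded unit like this handles only linearly separable targets; as the summary notes, representing and learning arbitrary Boolean functions of multivalued features requires a two-level net.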