Symbolic representation of neural networks
Published in: Computer (Long Beach, Calif.), 1996-03, Vol. 29 (3), p. 71-77
Main Authors:
Format: Article
Language: English
Summary: Neural networks often surpass decision trees in predicting pattern classifications, but their predictions cannot be explained. This algorithm's symbolic representations make each prediction explicit and understandable. Our approach to understanding a neural network uses symbolic rules to represent the network's decision process. The algorithm, NeuroRule, extracts these rules from a neural network. The network can be interpreted through these rules, which generally preserve network accuracy and explain the prediction process. We based NeuroRule on a standard three-layer feedforward network. NeuroRule consists of four phases. First, it builds a weight-decay backpropagation network so that the weights reflect the importance of the network's connections. Second, it prunes the network to remove irrelevant connections and units while maintaining the network's predictive accuracy. Third, it discretizes the hidden-unit activation values by clustering. Finally, it extracts rules from the network with the discretized hidden-unit activation values.
ISSN: 0018-9162, 1558-0814
DOI: 10.1109/2.485895
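
The summary above describes four phases: weight-decay training, pruning, discretization of hidden-unit activations by clustering, and rule extraction. The following is a minimal toy sketch of that pipeline, not the authors' NeuroRule implementation; the network sizes, thresholds, cluster count, and function names are all illustrative assumptions.

```python
# Toy sketch of the four phases summarized in the abstract (assumed details).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# --- Phase 1: train a small three-layer network with weight decay ---
def train_with_weight_decay(X, y, n_hidden=4, lr=0.5, decay=1e-3, epochs=5000):
    W1 = rng.normal(scale=0.5, size=(X.shape[1], n_hidden))
    W2 = rng.normal(scale=0.5, size=(n_hidden, 1))
    for _ in range(epochs):
        h = sigmoid(X @ W1)            # hidden-unit activations
        out = sigmoid(h @ W2)          # network output
        err = out - y                  # prediction error
        # backpropagation; the decay term keeps unimportant weights small
        grad_out = err * out * (1 - out)
        grad_W2 = h.T @ grad_out + decay * W2
        grad_W1 = X.T @ ((grad_out @ W2.T) * h * (1 - h)) + decay * W1
        W1 -= lr * grad_W1
        W2 -= lr * grad_W2
    return W1, W2

# --- Phase 2: prune connections whose weights stayed near zero ---
def prune(W, threshold=0.1):
    return np.where(np.abs(W) < threshold, 0.0, W)

# --- Phase 3: discretize hidden activations with a simple clustering ---
def discretize(h, n_clusters=3):
    # replace each activation by the centre of its nearest cluster
    centers = np.linspace(h.min(), h.max(), n_clusters)
    idx = np.argmin(np.abs(h[..., None] - centers), axis=-1)
    return centers[idx]

# --- Phase 4: read rules off the discretized network ---
# Each distinct discretized hidden-state vector maps to one output class;
# every such (state -> class) pair is reported as a rule.
def extract_rules(h_disc, W2):
    rules = {}
    for state in {tuple(row) for row in h_disc}:
        out = sigmoid(np.array(state) @ W2)
        rules[state] = int(out[0] > 0.5)
    return rules

# Usage on a toy problem (XOR): train, prune, discretize, extract.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)
W1, W2 = train_with_weight_decay(X, y)
W1, W2 = prune(W1), prune(W2)
h_disc = discretize(sigmoid(X @ W1))
print(extract_rules(h_disc, W2))
```

The pruning threshold and cluster count here are arbitrary; the point is only to show how pruning plus discretization turns continuous hidden activations into a small set of enumerable states from which symbolic rules can be read off.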