
Learning symmetric causal independence models

Bibliographic Details
Published in: Machine Learning, 2008-06, Vol. 71 (2-3), p. 133-153
Main Authors: Jurgelenaite, Rasa; Heskes, Tom
Format: Article
Language: English
Description
Summary: Causal independence modelling is a well-known method for reducing the size of probability tables, simplifying the probabilistic inference and explaining the underlying mechanisms in Bayesian networks. Recently, a generalization of the widely-used noisy OR and noisy AND models, causal independence models based on symmetric Boolean functions, was proposed. In this paper, we study the problem of learning the parameters in these models, further referred to as symmetric causal independence models. We present a computationally efficient EM algorithm to learn parameters in symmetric causal independence models, where the computational scheme of the Poisson binomial distribution is used to compute the conditional probabilities in the E-step. We study computational complexity and convergence of the developed algorithm. The presented EM algorithm allows us to assess the practical usefulness of symmetric causal independence models. In the assessment, the models are applied to a classification task; they perform competitively with state-of-the-art classifiers.
ISSN: 0885-6125, 1573-0565
DOI: 10.1007/s10994-007-5041-7
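
Note: The summary states that the E-step reuses the computational scheme of the Poisson binomial distribution. As a minimal illustration only, and not the authors' implementation, the Python sketch below computes that distribution over the number of active causes with a standard dynamic-programming recursion; the activation probabilities in the example are hypothetical. In a symmetric causal independence model the effect depends on the hidden causes only through this count, so summing the resulting probabilities over the counts accepted by the symmetric Boolean function (for example, "at least one" in the noisy OR case) gives the conditional probability of the effect.

    # Illustrative sketch only; not code from the paper.
    def poisson_binomial_pmf(p):
        """pmf[k] = probability that exactly k of the independent causes fire,
        where cause i fires with probability p[i]."""
        pmf = [1.0]                              # count distribution before any cause is seen
        for pi in p:
            new = [0.0] * (len(pmf) + 1)
            for k, mass in enumerate(pmf):
                new[k] += mass * (1.0 - pi)      # cause stays inactive: count unchanged
                new[k + 1] += mass * pi          # cause fires: count increases by one
            pmf = new
        return pmf

    if __name__ == "__main__":
        # Hypothetical activation probabilities for three causes.
        pmf = poisson_binomial_pmf([0.2, 0.5, 0.9])
        print(pmf)
        # Noisy OR special case (effect true iff at least one cause fires):
        # the conditional probability of the effect is 1 - pmf[0].
        print(1.0 - pmf[0])

The recursion in this sketch runs in time quadratic in the number of causes.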