Optimized realization of Bayesian networks in reduced normal form using latent variable model

Bibliographic Details
Published in: Soft computing (Berlin, Germany), 2021-05, Vol.25 (10), p.7029-7040
Main Authors: Di Gennaro, Giovanni, Buonanno, Amedeo, Palmieri, Francesco A. N.
Format: Article
Language:English
Description
Summary: Bayesian networks in their Factor Graph Reduced Normal Form are a powerful paradigm for implementing inference graphs. Unfortunately, the computational and memory costs of these networks may be considerable even for relatively small networks, and this is one of the main reasons why these structures have often been underused in practice. In this work, through a detailed algorithmic and structural analysis, various solutions for cost reduction are proposed. Moreover, an online version of the classic batch learning algorithm is analysed, showing very similar results in an unsupervised context but with much better performance, which may be essential if multi-level structures are to be built. The proposed solutions, together with the online learning algorithm, are included in a C++ library that is quite efficient, especially when compared to the direct use of the well-known sum-product and Maximum Likelihood algorithms. The results obtained are discussed with particular reference to a Latent Variable Model structure.
ISSN:1432-7643
1433-7479
DOI:10.1007/s00500-021-05642-3
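
Sketch: the summary refers to sum-product message passing in the Factor Graph Reduced Normal Form, where each conditional probability table acts as a single-input/single-output (SISO) block and variable replication is handled by diverter (equality-constraint) nodes. The C++ fragment below is a minimal illustrative sketch of those two local operations, assuming discrete variables represented as probability vectors; the type and function names are assumptions made for this example and are not the API of the authors' library.

#include <vector>
#include <numeric>
#include <cstddef>

// Illustrative types only, not the library's API.
using Message = std::vector<double>;               // discrete distribution over a variable's states
using Matrix  = std::vector<std::vector<double>>;  // row-stochastic table P(y = j | x = i)

// Messages are defined up to scale, so renormalize after each update.
static void normalize(Message& m) {
    double s = std::accumulate(m.begin(), m.end(), 0.0);
    if (s > 0.0) for (double& v : m) v /= s;
}

// SISO block, forward direction:  b_y(j) = sum_i f_x(i) * P(j | i)
Message forward(const Matrix& P, const Message& fx) {
    Message b(P[0].size(), 0.0);
    for (std::size_t i = 0; i < P.size(); ++i)
        for (std::size_t j = 0; j < P[i].size(); ++j)
            b[j] += fx[i] * P[i][j];
    normalize(b);
    return b;
}

// SISO block, backward direction:  b_x(i) = sum_j P(j | i) * f_y(j)
Message backward(const Matrix& P, const Message& fy) {
    Message b(P.size(), 0.0);
    for (std::size_t i = 0; i < P.size(); ++i)
        for (std::size_t j = 0; j < P[i].size(); ++j)
            b[i] += P[i][j] * fy[j];
    normalize(b);
    return b;
}

// Diverter (equality-constraint) node: the outgoing message on one branch is the
// element-wise product of the incoming messages on all other branches.
Message diverterOut(const std::vector<Message>& incoming, std::size_t skip) {
    Message out(incoming[0].size(), 1.0);
    for (std::size_t k = 0; k < incoming.size(); ++k) {
        if (k == skip) continue;
        for (std::size_t j = 0; j < out.size(); ++j) out[j] *= incoming[k][j];
    }
    normalize(out);
    return out;
}

Belief propagation over the whole reduced normal form graph then amounts to repeatedly applying these local updates along the edges of the network.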