
A "Microscopic" Study of Minimum Entropy Search in Learning Decomposable Markov Networks

Bibliographic Details
Published in: Machine Learning, 1997-01, Vol. 26 (1), p. 65
Main Authors: Xiang, Y.; Wong, S.K.M.; Cercone, N.
Format: Article
Language: English
Summary: Several scoring metrics are used in different search procedures for learning probabilistic networks. We study the properties of cross entropy in learning a decomposable Markov network. Although entropy and related scoring metrics have been widely used, their "microscopic" properties and asymptotic behavior during a search have not been analyzed. We present such a "microscopic" study of a minimum entropy search algorithm and show that it learns an I-map of the domain model when the data size is large. Search procedures that modify a network structure one link at a time have been commonly used for efficiency. Our study indicates that a class of domain models cannot be learned by such procedures. This suggests that prior knowledge about the problem domain, together with a multi-link search strategy, would provide an effective way to uncover many domain models.
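The abstract's scoring metric is the cross entropy (Kullback-Leibler divergence) between the empirical joint distribution and the distribution obtained by projecting it onto a decomposable Markov network, which reduces to the sum of clique entropies minus the sum of separator entropies minus the joint entropy. The sketch below is not the authors' implementation; it only illustrates that score for discrete data, with hypothetical names such as `kl_score` and `marginal_entropy`, and with the cliques and separators supplied by the caller rather than found by search.

```python
# Minimal sketch (assumptions noted above, not the paper's algorithm) of the
# cross-entropy score K(P, P_G) = sum_C H(P_C) - sum_S H(P_S) - H(P) that a
# minimum entropy search would try to drive toward zero.
from collections import Counter
from math import log


def marginal_entropy(data, variables):
    """Empirical entropy (in nats) of the marginal over `variables`."""
    counts = Counter(tuple(row[v] for v in variables) for row in data)
    n = len(data)
    return -sum((c / n) * log(c / n) for c in counts.values())


def kl_score(data, cliques, separators):
    """Cross entropy between the empirical joint and its projection onto a
    decomposable model with the given cliques and separators. Smaller is
    better; zero means the model is an exact factorization of the data."""
    h_model = (sum(marginal_entropy(data, c) for c in cliques)
               - sum(marginal_entropy(data, s) for s in separators))
    all_vars = sorted({v for row in data for v in row})
    return h_model - marginal_entropy(data, all_vars)


if __name__ == "__main__":
    # Toy data over three binary variables in which X and Z are
    # conditionally independent given Y.
    data = [
        {"X": 0, "Y": 0, "Z": 0}, {"X": 0, "Y": 0, "Z": 1},
        {"X": 1, "Y": 1, "Z": 1}, {"X": 1, "Y": 1, "Z": 0},
        {"X": 0, "Y": 0, "Z": 0}, {"X": 1, "Y": 1, "Z": 1},
    ]
    # Chain model X-Y-Z (cliques {X,Y}, {Y,Z}; separator {Y}) versus the
    # empty graph (singleton cliques, no separators).
    chain = kl_score(data, cliques=[("X", "Y"), ("Y", "Z")], separators=[("Y",)])
    empty = kl_score(data, cliques=[("X",), ("Y",), ("Z",)], separators=[])
    print(f"chain X-Y-Z score: {chain:.4f}, empty graph score: {empty:.4f}")
```

On this toy data the chain structure scores (approximately) zero while the empty graph scores higher, which is the sense in which the search prefers structures whose factorization loses no information about the joint distribution.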
ISSN: 0885-6125, 1573-0565
DOI: 10.1023/A:1007324100110