Inference in Hybrid Bayesian Networks Using Mixtures of Gaussians

Issue Date
2006-07
Author
Shenoy, Prakash P.
Publisher
AUAI Press, Corvallis, OR
Format
2551666 bytes
Type
Article
Abstract
The main goal of this paper is to describe a method for exact inference in general hybrid Bayesian networks (BNs), i.e., networks with a mixture of discrete and continuous chance variables. Our method consists of approximating general hybrid BNs by mixture-of-Gaussians (MoG) BNs. Lauritzen and Jensen (LJ) have described a fast algorithm for exact inference in MoG BNs, and a commercial implementation of this algorithm exists. However, the algorithm can only be used for MoG BNs, which are subject to several restrictions: all continuous chance variables must have conditional linear Gaussian distributions, and discrete chance nodes cannot have continuous parents. The methods described in this paper enable the LJ algorithm to be used for a larger class of hybrid BNs, including networks with continuous chance nodes that have non-Gaussian distributions, networks with no restrictions on the topology of discrete and continuous variables, networks with conditionally deterministic variables that are nonlinear functions of their continuous parents, and networks with continuous chance variables whose variances are functions of their parents.
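As a rough illustration of the underlying idea (not the paper's algorithm), a non-Gaussian continuous distribution can be approximated by a mixture of Gaussians so that the MoG restrictions are satisfied. The Python sketch below fits a three-component mixture to samples from a lognormal variable; the library, the lognormal example, and the component count are illustrative assumptions rather than details from the paper.

# Minimal sketch, assuming Python with numpy and scikit-learn:
# approximate a non-Gaussian chance variable by a mixture of Gaussians.
# The lognormal distribution and the three components are arbitrary choices.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# A non-Gaussian continuous chance variable, e.g., X ~ Lognormal(0, 0.5)
samples = rng.lognormal(mean=0.0, sigma=0.5, size=10000).reshape(-1, 1)

# Fit a 3-component mixture of Gaussians as an approximation of X's distribution
mog = GaussianMixture(n_components=3, random_state=0).fit(samples)

for w, mu, var in zip(mog.weights_, mog.means_.ravel(), mog.covariances_.ravel()):
    print("weight=%.3f  mean=%.3f  variance=%.3f" % (w, mu, var))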
ISBN
0-9749039-2-2
Citation
Shenoy, P. P., "Inference in hybrid Bayesian networks using mixtures of Gaussians," in R. Dechter and T. Richardson (eds.), Uncertainty in Artificial Intelligence: Proceedings of the Twenty-Second Conference (UAI-06), 2006, pp. 428--436, AUAI Press, Corvallis, OR.