Conditioning Graphs: Practical Structures for Inference in Bayesian Networks.
ABSTRACT Programmers employing inference in Bayesian networks typically rely on the inclusion of the model as well as an inference engine in their application. Sophisticated inference engines require non-trivial amounts of space and are also difficult to implement. This limits their use in some applications that would otherwise benefit from probabilistic inference. This paper presents a system that minimizes the space requirement of the model. The inference engine is sufficiently simple to avoid space limitations and to be easily implemented in almost any environment. We show a fast, compact indexing structure that is linear in the size of the network. The additional space required to compute over the model is linear in the number of variables in the network.
Conditioning Graphs: Practical
Structures for Inference in Bayesian
Networks
A Thesis Submitted to the
College of Graduate Studies and Research
in Partial Fulfillment of the Requirements
for the degree of Doctor of Philosophy
in the Department of Computer Science
University of Saskatchewan
Kevin John Grant
© Kevin John Grant, January/2007. All rights reserved.
Permission to Use
In presenting this thesis in partial fulfilment of the requirements for a Postgrad-
uate degree from the University of Saskatchewan, I agree that the Libraries of this
University may make it freely available for inspection. I further agree that permission
for copying of this thesis in any manner, in whole or in part, for scholarly purposes
may be granted by the professor or professors who supervised my thesis work or, in
their absence, by the Head of the Department or the Dean of the College in which
my thesis work was done. It is understood that any copying or publication or use of
this thesis or parts thereof for financial gain shall not be allowed without my written
permission. It is also understood that due recognition shall be given to me and to the
University of Saskatchewan in any scholarly use which may be made of any material
in my thesis.
Requests for permission to copy or to make other use of material in this thesis in
whole or part should be addressed to:
Head of the Department of Computer Science
176 Thorvaldson Building
110 Science Place
University of Saskatchewan
Abstract
Probability is a useful tool for reasoning when faced with uncertainty. Bayesian
networks offer a compact representation of a probabilistic problem, exploiting inde-
pendence amongst variables that allows a factorization of the joint probability into
much smaller local probability distributions.
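As background (standard Bayesian network theory rather than a contribution of this thesis), the factorization referred to above is the usual chain-rule decomposition over the network's parent structure,

\[
P(X_1, \ldots, X_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{pa}(X_i)\bigr),
\]

where \(\mathrm{pa}(X_i)\) denotes the parents of \(X_i\) in the directed graph; each factor \(P(X_i \mid \mathrm{pa}(X_i))\) is one of the small local distributions mentioned above.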
The standard approach to probabilistic inference in Bayesian networks is to com-
pile the graph into a join-tree, and perform computation over this secondary struc-
ture. While join-trees are among the most time-efficient methods of inference in
Bayesian networks, they are not appropriate for all applications. The
memory requirements of a join-tree can be prohibitively large. The algorithms for
computing over join-trees are large and involved, making them difficult to port to
other systems or to be understood by general programmers without Bayesian network expertise.
This thesis proposes a different method for probabilistic inference in Bayesian
networks. We present a data structure called a conditioning graph, which is a run-
time representation of Bayesian network inference. The structure mitigates many of
the problems of join-tree inference. For example, conditioning graphs require much
less space to store and compute over. The algorithm for calculating probabilities
from a conditioning graph is small and basic, making it portable to virtually any
architecture. And the details of Bayesian network inference are compiled away dur-
ing the construction of the conditioning graph, leaving an intuitive structure that is
easy to understand and implement without any Bayesian network expertise.
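To make the flavour of such an algorithm concrete, here is a minimal sketch in Python. It is an illustration only, not the thesis' actual data structure or pseudocode: the Node fields, the assumption of binary variables, and the handling of fixed (evidence) values by pre-setting them in `context` are all simplifications assumed for this example.

    # A simplified, illustrative sketch: internal nodes condition on a variable
    # and sum it out; leaves look up an entry of a local distribution under the
    # current assignment.
    class Node:
        def __init__(self, var=None, children=None, cpt=None, scope=None):
            self.var = var                 # variable summed out at an internal node
            self.children = children or []
            self.cpt = cpt                 # leaf only: dict from assignments to probabilities
            self.scope = scope or []       # leaf only: variables the local table mentions

    def query(node, context):
        """Return the probability mass consistent with the assignment in `context`."""
        if node.cpt is not None:
            # Leaf: assumes every variable in `scope` was assigned on the path here.
            return node.cpt[tuple(context[v] for v in node.scope)]
        fixed = node.var in context                      # e.g. evidence or query variable
        values = (context[node.var],) if fixed else (0, 1)   # binary variables assumed
        total = 0.0
        for value in values:
            context[node.var] = value
            product = 1.0
            for child in node.children:                  # multiply over subtrees
                product *= query(child, context)
            total += product                             # sum out node.var
        if not fixed:
            del context[node.var]
        return total

The sketch shows only the shape of the computation: leaves index a local distribution under the current assignment, and each internal node sums out its variable over the product of its subtrees.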
In addition to the conditioning graph architecture, we present several improve-
ments to the model that maintain its small and simple style while reducing the
runtime required for computing over it. We present two heuristics for choosing vari-
able orderings that result in shallower elimination trees, reducing the overall com-
plexity of computing over conditioning graphs. We also demonstrate several compile-time
and runtime extensions to the algorithm that can produce substantial speedups
while adding only a small constant amount of space to the implementation. We also
show how to cache intermediate values in conditioning graphs during probabilis-
tic computation, which allows conditioning graphs to perform at the same speed as
standard methods by avoiding duplicate computation, at the price of more memory.
The methods presented also conform to the basic style of the original algorithm.
We demonstrate a novel technique for reducing the amount of memory required for caching.
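For illustration, the caching idea can be layered onto the hypothetical query() sketch given earlier; the cache_scope field, assumed here to name the variables a subtree's value actually depends on, is invented for this example.

    # Hypothetical extension of query() above: memoize a node's result, keyed by
    # the assignment of the variables its subtree depends on (`cache_scope`).
    def cached_query(node, context, cache):
        if node.cpt is not None:
            return node.cpt[tuple(context[v] for v in node.scope)]
        key = (id(node), tuple(context.get(v) for v in node.cache_scope))
        if key in cache:                                 # reuse a previously computed subtree
            return cache[key]
        fixed = node.var in context
        values = (context[node.var],) if fixed else (0, 1)
        total = 0.0
        for value in values:
            context[node.var] = value
            product = 1.0
            for child in node.children:
                product *= cached_query(child, context, cache)
            total += product
        if not fixed:
            del context[node.var]
        cache[key] = total
        return total

Memoizing on only the relevant part of the assignment is what trades memory for time; limiting which nodes are permitted to cache is one natural way to bound that memory.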
We demonstrate empirically the compactness, portability, and ease of use of con-
ditioning graphs. We also show that the optimizations of conditioning graphs allow
competitive behaviour with standard methods in many circumstances, while still pre-
serving their small and simple style. Finally, we show that the memory required under
caching can be quite modest, meaning that conditioning graphs can be competitive
with standard methods in terms of time, using a fraction of the memory.
Acknowledgements
A graduate degree is not an individual effort. It is the collaboration of many individuals,
direct and indirect. I mention a few here, but am grateful to all who were a part of this.
First, I wish to thank my supervisor, Michael Horsch. Mike was that supervisor that
every grad student hopes for. He always had time and patience to discuss ideas, both good
and bad. His knowledge of my research topics helped to keep me progressing. Most of all,
I am thankful for his friendship. His faith in me was always an encouragement. Thanks, Mike.
My graduate studies program was funded in large part by the Natural Sciences and
Engineering Research Council of Canada (NSERC). Their support is greatly appreciated.
I am grateful for the guidance of my thesis committee members, who include Eric
Neufeld, Mark Keil, Winfried Grassmann, Mik Bickis, and Eugene Santos. Thank you for your time and insight.
I would like to express my gratitude to the people of our department. I would like to
thank our department heads (Jim Greer and Kevin Schneider) for giving me the opportu-
nity to teach. During my study, I have called upon the expertise of many members of our
faculty; thank you all for your help. I would also like to acknowledge the quiet, tireless
efforts of our support staff and office staff, for your help and patience in making sure my
computers ran, my papers printed, and my deadlines were met. Finally, a special thanks
to Eric Neufeld. When Mike left on sabbatical, I often called upon Eric for his expertise
in academic matters, and he always made time to answer my myriad of questions.
I could not have done this without my family. To Kelly, Jessie, Stephanie and Aaron,
my grandparents, and Maureen’s family (to name only a few), thanks for all your support.
I especially wish to thank my parents. Their unfailing love, support, friendship, and
encouragement are the reason for any successes that I may enjoy.
Most of all, I wish to thank Maureen and Samantha. Maureen has given so much
during my study, and taken so little in return. Postgraduate education entails sacrifices,
and the two of you have made them without question.
Kevin John Grant
Saskatoon, Saskatchewan
January 9, 2007
Michael C. Horsch