Article
Computing Posterior Probabilities of Structural Features in Bayesian Networks
05/2012
Source: arXiv

Conference Paper: Ancestor Relations in the Presence of Unobserved Variables.
ABSTRACT: Bayesian networks (BNs) are an appealing model for causal and noncausal dependencies among a set of variables. Learning BNs from observational data is challenging due to the nonidentifiability of the network structure and model misspecification in the presence of unobserved (latent) variables. Here, we investigate the prospects of Bayesian learning of ancestor relations, including arcs, in the presence and absence of unobserved variables. An exact dynamic programming algorithm to compute the respective posterior probabilities is developed, under the complete data assumption. Our experimental results show that ancestor relations between observed variables, arcs in particular, can be learned with good power even when a majority of the involved variables are unobserved. For comparison, deducing ancestor relations from a single maximum a posteriori network structure or its Markov equivalence class appears somewhat inferior to Bayesian averaging. We also discuss some shortcomings of applying existing conditional-independence-test-based methods to learning ancestor relations.
Machine Learning and Knowledge Discovery in Databases - European Conference, ECML PKDD 2011, Athens, Greece, September 5-9, 2011, Proceedings, Part II; 01/2011
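The Bayesian-averaging idea in the abstract above can be illustrated by brute force on a tiny example: enumerate every DAG on three variables, weight each by a score, and average the indicator of an ancestor relation. The paper's contribution is an exact dynamic program that avoids this enumeration; the sketch below is for intuition only, and the score function is a hypothetical stand-in for a real, data-derived one.

```python
from itertools import product

# Brute-force Bayesian averaging of the ancestor relation "a is an
# ancestor of b" over all DAGs on three variables X, Y, Z.
# The score() below is made up for illustration; a real score would
# come from the data (e.g. a marginal likelihood times a prior).

PAIRS = (("X", "Y"), ("X", "Z"), ("Y", "Z"))

def is_acyclic(edges):
    # With at most one edge per pair, the only possible cycles on
    # three nodes are the two directed 3-cycles.
    return not ({("X", "Y"), ("Y", "Z"), ("Z", "X")} <= edges or
                {("Y", "X"), ("Z", "Y"), ("X", "Z")} <= edges)

def is_ancestor(edges, a, b):
    # Reachability via a naive transitive closure.
    reach = set(edges)
    changed = True
    while changed:
        changed = False
        for (u, v) in list(reach):
            for (x, y) in list(reach):
                if v == x and (u, y) not in reach:
                    reach.add((u, y))
                    changed = True
    return (a, b) in reach

def score(edges):
    # Hypothetical unnormalized posterior weight: favor graphs
    # containing the arc X->Y.
    return 2.0 if ("X", "Y") in edges else 1.0

def posterior_ancestor(a, b):
    num = den = 0.0
    # Each pair is absent (0), oriented forward (1), or backward (2).
    for states in product((0, 1, 2), repeat=3):
        edges = set()
        for (u, v), s in zip(PAIRS, states):
            if s == 1:
                edges.add((u, v))
            elif s == 2:
                edges.add((v, u))
        if not is_acyclic(edges):
            continue
        w = score(edges)
        den += w
        if is_ancestor(edges, a, b):
            num += w
    return num / den

print(posterior_ancestor("X", "Z"))
```

This enumerates all 25 DAGs on three nodes; the exact algorithm in the paper computes the same kind of posterior without enumerating structures.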
Conference Paper: Exact Structure Discovery in Bayesian Networks with Less Space.
ABSTRACT: The fastest known exact algorithms for score-based structure discovery in Bayesian networks on n nodes run in time and space 2^n n^O(1). The usage of these algorithms is limited to networks on at most around 25 nodes, mainly due to the space requirement. Here, we study space-time tradeoffs for finding an optimal network structure. When little space is available, we apply the Gurevich-Shelah recurrence, originally proposed for the Hamiltonian path problem, and obtain …
UAI 2009, Proceedings of the Twenty-Fifth Conference on Uncertainty in Artificial Intelligence, Montreal, QC, Canada, June 18-21, 2009; 01/2009
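The 2^n n^O(1) algorithms referred to above are dynamic programs over node subsets: the optimal score of a subset S is obtained by choosing a sink v of S and adding v's best local score over parent sets drawn from S \ {v} to the optimal score of S \ {v}. A minimal sketch of this recurrence, with a hypothetical local score in place of a real data-derived one:

```python
from itertools import combinations

# Toy sketch of the classic 2^n dynamic program for score-based
# structure discovery. local_score() is a made-up stand-in for a
# real scoring function such as BDeu or BIC.

def local_score(v, parents):
    # Hypothetical score: node v earns 1.0 exactly when its parent
    # set is {v - 1}, and 0.0 otherwise.
    return 1.0 if parents == frozenset({v - 1}) else 0.0

def best_parent_score(v, candidates):
    # Best local score of v over all parent sets within `candidates`.
    best = float("-inf")
    for r in range(len(candidates) + 1):
        for ps in combinations(candidates, r):
            best = max(best, local_score(v, frozenset(ps)))
    return best

def best_network_score(nodes):
    # best[S] = score of an optimal DAG on the subset S, built by
    # picking a "sink" v of S and recursing on S \ {v}.
    best = {frozenset(): 0.0}
    subsets = sorted((frozenset(c)
                      for r in range(1, len(nodes) + 1)
                      for c in combinations(nodes, r)), key=len)
    for S in subsets:
        best[S] = max(best[S - {v}] + best_parent_score(v, S - {v})
                      for v in S)
    return best[frozenset(nodes)]

print(best_network_score((0, 1, 2, 3)))  # 3.0 with the toy score above
```

The table `best` is what makes the space cost 2^n; the space-time tradeoffs studied in the paper reduce this table at the price of recomputation.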
ABSTRACT: We study the problem of learning Bayesian network structures from data. We develop an algorithm for finding the k best Bayesian network structures. We propose to compute the posterior probabilities of hypotheses of interest by Bayesian model averaging over the k best Bayesian networks. We present empirical results on structural discovery over several real and synthetic data sets and show that the method outperforms the model selection method and state-of-the-art MCMC methods.
03/2012
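Model averaging over the k best networks amounts to normalizing the network scores (via log-sum-exp for numerical stability) and averaging the indicator of the hypothesis of interest under the resulting weights. A small sketch, where both the structures and their log scores are made up for illustration:

```python
import math

# Each entry pairs a structure (as a set of arcs) with a log score
# (log marginal likelihood + log prior). In the paper these would
# come from an exact k-best search; here they are invented.
k_best = [
    ({"A->B", "B->C"}, -10.2),
    ({"A->B"},         -11.0),
    ({"B->A", "B->C"}, -11.5),
]

def posterior_of_feature(k_best, feature):
    # Log-sum-exp normalization: subtract the max log score before
    # exponentiating, then average the feature indicator.
    m = max(s for _, s in k_best)
    weights = [math.exp(s - m) for _, s in k_best]
    z = sum(weights)
    return sum(w for (g, _), w in zip(k_best, weights)
               if feature in g) / z

print(posterior_of_feature(k_best, "A->B"))
```

The same weighting applies to any structural feature (an arc, a path, an ancestor relation) expressible as an indicator over structures.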