Kari Rantanen defends PhD thesis

M.Sc. Kari Rantanen successfully defended his doctoral dissertation Optimization Algorithms for Learning Graphical Model Structures on December 8, 2021. The thesis work was conducted in the Constraint Reasoning and Optimization group of the Department of Computer Science at the University of Helsinki, with Professor Matti Järvisalo and Dr. Antti Hyttinen as the PhD supervisors. Professor Peter van Beek (University of Waterloo, Canada) acted as the opponent. Congratulations, Kari!

An electronic version of the doctoral dissertation is available on the e-thesis site of the University of Helsinki at http://urn.fi/URN:ISBN:978-951-51-7750-6

Optimization Algorithms for Learning Graphical Model Structures

Graphical models are a versatile machine learning framework that enables efficient and intuitive representations of probability distributions. They make it possible to perform various data analysis tasks that would otherwise be infeasible. This is achieved by constructing a graph structure that encodes the dependence structure of the underlying probability distribution. To that end, the field of structure learning develops specialized algorithms for learning a structure that describes given data well.
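
To make the score-based view of structure learning concrete, here is a minimal, hypothetical sketch (not from the thesis): a candidate structure, given as a map from each variable to its parent set, is scored with a decomposable BIC score on discrete data, and a learner would then search for a structure that maximizes this score. The function names and the toy data below are purely illustrative.

```python
import numpy as np
from collections import defaultdict
from math import log

def bic_family(data, child, parents, arities):
    """BIC contribution of one variable given a candidate parent set (discrete data)."""
    n = data.shape[0]
    counts = defaultdict(lambda: defaultdict(int))
    for row in data:
        counts[tuple(row[p] for p in parents)][row[child]] += 1
    loglik = sum(
        c * log(c / sum(child_counts.values()))
        for child_counts in counts.values()
        for c in child_counts.values()
    )
    # Penalty: number of free parameters of the conditional distribution.
    n_params = (arities[child] - 1) * int(np.prod([arities[p] for p in parents], dtype=int))
    return loglik - 0.5 * log(n) * n_params

def bic_score(data, structure, arities):
    """Total decomposable score of a structure given as {variable: parent tuple}."""
    return sum(bic_family(data, v, ps, arities) for v, ps in structure.items())

# Example: three binary variables, candidate structure X0 -> X2 <- X1.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(500, 3))
print(bic_score(data, {0: (), 1: (), 2: (0, 1)}, arities=[2, 2, 2]))
```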

This thesis presents advances in structure learning for three different graphical model classes: chordal Markov networks, maximal ancestral graphs, and causal graphs with cycles and latent confounders. Learning structures for these model classes has turned out to be a very challenging task, with the few existing exact methods scaling to far fewer random variables than the more extensively developed methods for Bayesian networks.

Chordal Markov networks are a central class of undirected graphical models. Being equivalent to so-called decomposable models, they are essentially a special case of Bayesian networks. This thesis presents an exact branch-and-bound algorithm and an inexact stochastic local search algorithm for learning chordal Markov network structures. Empirically, we show that the branch-and-bound algorithm is at times able to learn provably optimal structures over a higher number of variables than competing methods, whereas the local search scales considerably further.
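
To give a rough idea of what such a local search can look like, the toy sketch below performs greedy edge-change moves restricted to chordal graphs, using networkx for the chordality check. This is only an illustration of the general idea, not the algorithm from the thesis; `score` stands in for a decomposable model score, and a real stochastic local search would additionally use noise or restarts to escape local optima.

```python
import random
import networkx as nx

def greedy_search_chordal(n_vars, score, n_steps=1000, seed=0):
    """Greedy local search over chordal undirected graphs on n_vars nodes."""
    rng = random.Random(seed)
    g = nx.empty_graph(n_vars)            # the empty graph is trivially chordal
    best = score(g)
    for _ in range(n_steps):
        u, v = rng.sample(range(n_vars), 2)
        candidate = g.copy()
        if candidate.has_edge(u, v):
            candidate.remove_edge(u, v)   # note: edge deletions can also break chordality
        else:
            candidate.add_edge(u, v)
        if not nx.is_chordal(candidate):
            continue                      # only move within the chordal model class
        s = score(candidate)
        if s > best:
            g, best = candidate, s
    return g, best

# Example with a toy score that rewards edges but penalizes dense neighbourhoods.
toy_score = lambda g: g.number_of_edges() - 0.1 * sum(d * d for _, d in g.degree())
print(greedy_search_chordal(6, toy_score))
```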

Maximal ancestral graphs are a generalization of Bayesian networks that allows for representing the influence of unobserved variables. This thesis introduces the first exact search algorithm for learning maximal ancestral graphs, together with a framework for generating and pruning local scores for the search. Empirically, we show that the proposed exact search is able to learn higher-quality structures than existing inexact methods.
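
Although the thesis develops its own score generation and pruning framework for maximal ancestral graphs, the sketch below illustrates one generic pruning rule often used with precomputed local scores in exact score-based search, stated here for plain parent sets for simplicity: a candidate parent set can be discarded when some subset of it scores at least as well, since the subset could always be used in its place.

```python
from itertools import combinations

def prune_local_scores(scores):
    """scores: dict mapping frozenset (parent set) -> local score (higher is better)."""
    kept = {}
    for pset, s in scores.items():
        dominated = any(
            scores.get(frozenset(sub), float("-inf")) >= s
            for r in range(len(pset))          # all proper subsets, including the empty set
            for sub in combinations(pset, r)
        )
        if not dominated:
            kept[pset] = s
    return kept

# Example: {1, 2} is pruned because the subset {1} already scores at least as well.
scores = {frozenset(): -110.0, frozenset({1}): -100.0, frozenset({1, 2}): -105.0}
print(prune_local_scores(scores))
```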

Finally, we introduce an exact branch-and-bound algorithm for learning causal graphs in the presence of cycles and latent confounders. Our empirical results show that the presented algorithm is able to learn optimal structures considerably faster than a recent exact method for the problem. We also extend our branch and bound to support interventional data and σ-separation, and show empirically that the algorithm can handle a higher number of experimental datasets than the only competing method that supports σ-separation.
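
As a final illustration, here is a generic branch-and-bound skeleton for score maximization. Only the overall scheme is shown; the branching rules, bounds, and the handling of interventional data and σ-separation in the thesis are considerably more involved, and the callbacks `children`, `bound`, `value`, and `is_complete` are hypothetical placeholders.

```python
def branch_and_bound(root, children, bound, value, is_complete):
    """Depth-first branch and bound: maximize value over complete structures."""
    best_val, best_sol = float("-inf"), None
    stack = [root]
    while stack:
        node = stack.pop()
        if bound(node) <= best_val:       # optimistic bound cannot beat the incumbent: prune
            continue
        if is_complete(node):             # a fully specified structure is a candidate solution
            v = value(node)
            if v > best_val:
                best_val, best_sol = v, node
            continue
        stack.extend(children(node))      # branch on the next structural decision
    return best_sol, best_val
```

Pruning in this scheme is sound as long as `bound(node)` is a true upper bound on the score of every complete structure reachable from `node`.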