Abstract
Probabilistic graphical models are a key tool in machine learning applications. Computing the partition function, i.e., the normalizing constant, is a fundamental task of statistical inference, but it is generally computationally intractable, leading to extensive study of approximation methods. Iterative variational methods are a popular and successful family of approaches. However, even state-of-the-art variational methods can return poor results or fail to converge on difficult instances. In this paper, we instead consider computing the partition function via sequential summation over variables. We develop robust approximate algorithms by combining ideas from mini-bucket elimination with tensor network and renormalization group methods from statistical physics. The resulting “convergence-free” methods show good empirical performance on both synthetic and real-world benchmark models, even for difficult instances.
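The paper's mini-bucket and tensor-network algorithms are not reproduced here, but the core idea the abstract names, computing the partition function by sequential summation over variables (variable elimination), is easy to illustrate. Below is a minimal Python sketch on a toy binary chain model with two hypothetical factors `f12` and `f23` (names and model chosen purely for illustration); it checks the sequentially eliminated result against brute-force enumeration. The paper's machinery for approximating this contraction when exact elimination is too costly is not shown.

```python
import itertools
import numpy as np

# Toy pairwise model on a binary chain x1 - x2 - x3 (illustrative only).
# The partition function is Z = sum over all joint states of the
# product of factors.
rng = np.random.default_rng(0)
f12 = rng.random((2, 2))  # positive factor table over (x1, x2)
f23 = rng.random((2, 2))  # positive factor table over (x2, x3)

# Brute force: enumerate all 2^3 joint states.
Z_brute = sum(f12[x1, x2] * f23[x2, x3]
              for x1, x2, x3 in itertools.product(range(2), repeat=3))

# Sequential summation (variable elimination): sum out one variable
# at a time, contracting it against the factors that mention it.
m1 = f12.sum(axis=0)                   # eliminate x1 -> message over x2
m2 = (m1[:, None] * f23).sum(axis=0)   # eliminate x2 -> message over x3
Z_elim = m2.sum()                      # eliminate x3 -> scalar Z

assert np.isclose(Z_brute, Z_elim)
print(Z_elim)
```

On a chain this contraction is cheap, but on graphs with high treewidth the intermediate messages grow exponentially, which is what motivates the mini-bucket and renormalization-group approximations the paper develops.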
| Original language | English (US) |
|---|---|
| Pages (from-to) | 109-118 |
| Number of pages | 10 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 80 |
| State | Published - 2018 |
| Externally published | Yes |
| Event | 35th International Conference on Machine Learning, ICML 2018 - Stockholm, Sweden |
| Duration | Jul 10 2018 → Jul 15 2018 |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Statistics and Probability
- Artificial Intelligence