TY - JOUR
T1 - Exact Fractional Inference via Re-Parametrization & Interpolation between Tree-Re-Weighted- and Belief Propagation- Algorithms
AU - Behjoo, Hamidreza
AU - Chertkov, Michael
N1 - Publisher Copyright:
© 2024, Transactions on Machine Learning Research. All rights reserved.
PY - 2024
Y1 - 2024
AB - Computing the partition function, Z, of an Ising model over a graph of N “spins” most likely requires time exponential in N. Efficient variational methods, such as the Belief Propagation (BP) and Tree Re-Weighted (TRW) algorithms, compute Z approximately by minimizing the respective (BP- or TRW-) free energy. We generalize the variational scheme by building a λ-fractional interpolation, Z(λ), where λ = 0 and λ = 1 correspond to the TRW and BP approximations, respectively. This fractional scheme – coined Fractional Belief Propagation (FBP) – guarantees that in the attractive (ferromagnetic) case Z(TRW) ≥ Z(λ) ≥ Z(BP), and that there exists a unique (“exact”) λ* such that Z = Z(λ*). Generalizing the re-parametrization approach of Wainwright et al. (2001) and the loop series approach of Chertkov & Chernyak (2006a), we show how to express Z as a product, ∀λ: Z = Z(λ) Z̃(λ), where the multiplicative correction, Z̃(λ), is an expectation over a node-independent probability distribution built from node-wise fractional marginals. Our theoretical analysis is complemented by extensive experiments with models from Ising ensembles over planar and random graphs of medium and large sizes. Our empirical study yields a number of interesting observations, such as the ability to estimate Z̃(λ) with O(N^[2::4]) fractional samples, where [2::4] indicates an exponent range from 2 to 4, and the suppression of variation in λ* estimates with an increase in N for instances from a particular random Ising ensemble. We also verify and discuss the applicability of this approach to the problem of image de-noising. Based on these experiments and the theory, we conclude that both the partition function and the marginals can be computed efficiently via FBP at the optimal value λ*.
UR - http://www.scopus.com/inward/record.url?scp=85219505107&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85219505107&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85219505107
SN - 2835-8856
VL - 2024
JO - Transactions on Machine Learning Research
JF - Transactions on Machine Learning Research
ER -
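
The abstract's claim that a unique λ* satisfies Z = Z(λ*), bracketed in the attractive case by Z(TRW) = Z(0) ≥ Z ≥ Z(1) = Z(BP), suggests a simple bisection search. Below is a minimal Python sketch of that idea, appended after the record as an illustration rather than as part of it. It assumes a callable z_lambda(lam) returning the FBP estimate Z(λ); since implementing BP/TRW is beyond this sketch, the demo uses a hypothetical monotone stand-in (a geometric interpolation between assumed TRW and BP values), and z_exact_ising is a brute-force reference usable only for small N.

# Hypothetical sketch (not the paper's code): locate lambda* with Z(lambda*) = Z
# via bisection, using the attractive-case bracketing Z(0) >= Z >= Z(1).
import itertools
import math

def z_exact_ising(J, h):
    """Brute-force partition function of a small Ising model.
    J: dict {(i, j): coupling}; h: list of local fields. O(2^N), testing only."""
    n = len(h)
    z = 0.0
    for spins in itertools.product((-1, 1), repeat=n):
        energy = sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
        energy += sum(hi * s for hi, s in zip(h, spins))
        z += math.exp(energy)
    return z

def find_lambda_star(z_lambda, z_exact, tol=1e-10):
    """Bisection on [0, 1] for the unique lambda* with z_lambda(lambda*) = z_exact,
    assuming z_lambda decreases monotonically from Z(TRW) to Z(BP)."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if z_lambda(mid) > z_exact:
            lo = mid  # estimate still above Z: move toward the BP end
        else:
            hi = mid  # estimate at or below Z: move toward the TRW end
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    # 3-spin ferromagnetic triangle with weak fields; exact Z by enumeration.
    J = {(0, 1): 0.5, (1, 2): 0.5, (0, 2): 0.5}
    h = [0.1, -0.2, 0.05]
    z = z_exact_ising(J, h)
    # Hypothetical stand-in for the FBP estimate: geometric interpolation
    # between assumed TRW (upper) and BP (lower) values, monotone in lambda.
    z_trw, z_bp = 1.2 * z, 0.8 * z
    z_lam = lambda lam: z_trw * (z_bp / z_trw) ** lam
    print("lambda* =", find_lambda_star(z_lam, z))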