Publication
Informed Lifting for Message-Passing
Kristian Kersting; Youssef El Massaoudi; Fabian Hadiji; Babak Ahmadi
In: Maria Fox; David Poole (eds.). Proceedings of the Twenty-Fourth AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence (AAAI-2010), July 11-15, Atlanta, Georgia, USA, Pages 1181-1186, AAAI Press, 2010.
Abstract
Lifted inference, which handles whole sets of indistinguishable objects together, is critical to the effective application of probabilistic relational models to realistic real-world tasks. Recently, lifted belief propagation (LBP) has been proposed as an efficient approximate solution to this inference problem. It runs a modified BP on a lifted network in which nodes have been grouped together if they have, roughly speaking, identical computation trees, the tree-structured "unrolling" of the underlying graph rooted at the nodes. In many situations, this purely syntactic criterion is too pessimistic: message errors decay along paths. Intuitively, in a long chain graph with weak edge potentials, distant nodes will send and receive identical messages even though their computation trees are quite different. To overcome this, we propose iLBP, a novel, easy-to-implement, informed LBP approach that interleaves lifting and modified BP iterations. In turn, we can efficiently monitor the true BP messages sent and received in each iteration and group nodes accordingly. As our experiments show, iLBP can yield significantly more lifted networks faster while not degrading performance. Above all, we show that iLBP is faster than BP when solving the problem of distributing data to a large network, an important real-world application where BP is faster than uninformed LBP.
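The core idea of the abstract can be illustrated with a small sketch. This is a hypothetical toy implementation, not the authors' code: it runs sum-product BP on a binary chain MRF with weak pairwise potentials (names like `PHI`, `PSI`, `signature`, and the parameter values are illustrative assumptions) and, after each iteration, groups nodes whose actual incoming messages agree up to a tolerance. Evidence is placed at one end of the chain; because its influence decays along the weak chain, distant nodes end up sending and receiving effectively identical messages and collapse into one group, even though their computation trees differ.

```python
# Hypothetical sketch of the iLBP idea (not the paper's implementation):
# interleave BP iterations with grouping of nodes by their *actual* messages.

N = 20                      # chain length (illustrative choice)
EPS = 0.05                  # weak coupling strength
TOL = 1e-6                  # message-equality tolerance for grouping

# Pairwise potential slightly favoring agreement; unary evidence at node 0.
PHI = [[1.0 + EPS, 1.0], [1.0, 1.0 + EPS]]
PSI = [[2.0, 1.0] if i == 0 else [1.0, 1.0] for i in range(N)]

def normalize(m):
    s = sum(m)
    return [v / s for v in m]

def bp_iteration(msgs):
    """One synchronous sum-product update; msgs[(i, j)] goes from i to j."""
    new = {}
    for (i, j) in msgs:
        # Product of the sender's unary potential and all other incoming messages.
        prod = [PSI[i][0], PSI[i][1]]
        for k in (i - 1, i + 1):
            if 0 <= k < N and k != j:
                prod = [prod[x] * msgs[(k, i)][x] for x in range(2)]
        new[(i, j)] = normalize(
            [sum(PHI[x][y] * prod[x] for x in range(2)) for y in range(2)])
    return new

def signature(i, msgs):
    """Group key: a node's evidence plus its rounded incoming messages."""
    inc = sorted(tuple(round(v / TOL) for v in msgs[(k, i)])
                 for k in (i - 1, i + 1) if 0 <= k < N)
    return (tuple(PSI[i]), tuple(inc))

# Initialize all directed edge messages to uniform, then interleave
# BP updates with (re)grouping, as iLBP does.
msgs = {(i, j): [0.5, 0.5]
        for i in range(N) for j in (i - 1, i + 1) if 0 <= j < N}
groups = {}
for _ in range(10):
    msgs = bp_iteration(msgs)
    groups = {}
    for i in range(N):
        groups.setdefault(signature(i, msgs), []).append(i)

# Far from the evidence, message differences fall below TOL and nodes
# collapse into a single large group, far fewer groups than nodes.
print(f"{len(groups)} message groups for {N} nodes")
```

A purely syntactic lifting would keep nodes at different distances from the chain ends in separate groups, since their computation trees differ; grouping by the monitored messages instead exploits the decay of message errors that the abstract describes.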