Publication

Lifted Inference for Convex Quadratic Programs

Martin Mladenov; Leonard Kleinhans; Kristian Kersting
In: Satinder Singh; Shaul Markovitch (Eds.). Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence. AAAI Conference on Artificial Intelligence (AAAI-2017), February 4-9, 2017, San Francisco, California, USA, pages 2350-2356, AAAI Press, 2017.

Abstract

Symmetry is the essential element of lifted inference, which has recently demonstrated that very efficient inference is possible in highly connected but symmetric probabilistic models. This raises the question of whether the same holds for optimization problems in general. Here we show that for a large class of optimization methods this is indeed the case. Specifically, we introduce the concept of fractional symmetries of convex quadratic programs (QPs), which lie at the heart of many AI and machine learning approaches, and exploit it to lift, i.e., to compress, QPs. These lifted QPs can then be tackled with the usual optimization toolbox (off-the-shelf solvers, cutting plane algorithms, stochastic gradients, etc.). If the original QP exhibits symmetry, then the lifted one will generally be more compact, and hence more efficient to solve.
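To illustrate the basic idea of compressing a symmetric QP, the following is a minimal sketch using only plain orbit symmetry (groups of interchangeable variables), not the more general fractional symmetries developed in the paper. The toy data, the orbit sizes, and the helper solve_eq_qp are illustrative assumptions; the paper's method detects symmetries automatically and handles inequality-constrained QPs as well.

```python
import numpy as np

# Sizes: orbit 1 has m interchangeable variables, orbit 2 has n - m.
m, n = 3, 5

# Convex QP  min 1/2 x^T Q x + c^T x  s.t.  1^T x = 1
# whose data are invariant under permuting variables within each orbit.
# (Illustrative data; any orbit-invariant positive-definite Q works.)
Q = 2.0 * np.eye(n)
Q[:m, :m] += 0.5          # uniform coupling inside orbit 1
Q[m:, m:] += 0.3          # uniform coupling inside orbit 2
c = np.concatenate([np.full(m, 1.0), np.full(n - m, -1.0)])
a = np.ones(n)            # constraint row: sum(x) = 1
b = 1.0

def solve_eq_qp(Q, c, A, b):
    """Solve min 1/2 x^T Q x + c^T x  s.t.  A x = b  via the KKT system."""
    A = np.atleast_2d(A)
    k = A.shape[0]
    kkt = np.block([[Q, A.T], [A, np.zeros((k, k))]])
    rhs = np.concatenate([-c, np.atleast_1d(b)])
    return np.linalg.solve(kkt, rhs)[: Q.shape[0]]

# Ground (uncompressed) solution of the full QP.
x_full = solve_eq_qp(Q, c, a, b)

# Lift: restrict x to be constant on each orbit, x = B y.
B = np.zeros((n, 2))
B[:m, 0] = 1.0
B[m:, 1] = 1.0

# Compressed QP in the two orbit variables y.
Q_lift = B.T @ Q @ B
c_lift = B.T @ c
a_lift = a @ B            # = [m, n - m]
y = solve_eq_qp(Q_lift, c_lift, a_lift, b)

# The lifted solution expands to the ground optimum.
print(np.allclose(B @ y, x_full))   # True: symmetry made the QP compressible
```

Because the QP is strictly convex and its data are invariant under permutations within each orbit, its unique optimum is constant on orbits, so solving the two-variable lifted QP and expanding with B recovers the five-variable solution exactly.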
