Title: Lazy Generic Cuts: Efficient MAP inference in Higher Order MRFs
Speaker: Dinesh Khandelwal
Abstract:
LP-relaxation-based message passing and flow-based algorithms are
two popular techniques for performing MAP inference in graphical
models. Generic Cuts (GC) (Arora et al. 2012) combines the two
approaches, generalizing traditional max-flow/min-cut algorithms to
binary models with higher-order clique potentials. The algorithm has
been shown to be exponentially faster than the state of the art. The
time and memory complexity of Generic Cuts is linear in the number of
constraints, which in turn is exponential in the clique size.
We propose a lazy version of Generic Cuts that exploits the observation
that, in most such inference problems, a large fraction of the
constraints is never used during the minimization. We start with a
small set of constraints (the active constraints) that are expected to
play a role during the minimization. GC is then run with this reduced
set, making it efficient in both time and memory. The set of active
constraints is adaptively learnt over multiple iterations while
guaranteeing convergence to the optimum for submodular clique
potentials.
potentials. Our experiments show that the number of constraints required
by the algorithm is typically less than 3% of the total constraints. Our
algorithm requires time and memory linear in the number of such
constraints `actually' used.
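
The lazy strategy described in the abstract can be sketched in a generic form. The following is an illustrative toy example, not the actual Generic Cuts algorithm: it shows the idea of solving with a small active constraint set and adding only the constraints that the current solution violates, here on a small linear program solved with SciPy. All problem data and names (`lazy_solve`, the toy constraint matrix) are invented for illustration.

```python
# Sketch of lazy constraint activation (cutting-plane style), assuming a
# toy LP stands in for the real inference problem. Not the GC algorithm.
from scipy.optimize import linprog

# Toy problem: minimize c.x subject to A x <= b, x >= 0.
c = [-1.0, -1.0]                       # i.e. maximize x0 + x1
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0], [1.0, 2.0]]
b = [4.0, 4.0, 6.0, 9.0, 9.0]

def lazy_solve(c, A, b, tol=1e-9):
    # Start with a small active set that keeps the problem bounded.
    active = [0, 1]
    while True:
        res = linprog(c, A_ub=[A[i] for i in active],
                      b_ub=[b[i] for i in active], bounds=(0, None))
        x = res.x
        # Find constraints violated by the current solution.
        violated = [i for i in range(len(A))
                    if i not in active
                    and sum(a * v for a, v in zip(A[i], x)) > b[i] + tol]
        if not violated:               # all constraints hold: optimal
            return x, active
        active.extend(violated)        # grow the active set and re-solve

x, active = lazy_solve(c, A, b)
```

On larger instances one would typically add only the most violated constraints per round, so the active set stays a small fraction of the full constraint set, mirroring the sub-3% figure reported in the abstract.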