Bidual Lagrangians in ML tasks
http://arxiv.org/abs/1201.3674
Recent results in Compressive Sensing have shown that, under certain
conditions, the solution to an underdetermined system of linear equations with
sparsity-based regularization can be accurately recovered by solving convex
relaxations of the original problem. In this work, we present a novel
primal-dual analysis of a class of sparsity minimization problems. We show that
the Lagrangian bidual (i.e., the Lagrangian dual of the Lagrangian dual) of the
sparsity minimization problems can be used to derive interesting convex
relaxations: the bidual of the $\ell_0$-minimization problem is the
$\ell_1$-minimization problem; and the bidual of the $\ell_{0,1}$-minimization
problem for enforcing group sparsity on structured data is the
$\ell_{1,\infty}$-minimization problem. The analysis provides a means to
compute per-instance non-trivial lower bounds on the (group) sparsity of the
desired solutions. In a real-world application, the bidual relaxation improves
the performance of a sparsity-based classification framework applied to robust
face recognition.
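The central equivalence above (the bidual of $\ell_0$-minimization is $\ell_1$-minimization) can be illustrated with a small, self-contained sketch: the $\ell_1$ relaxation (basis pursuit) is itself a linear program. This is not the paper's code; the function name and the use of `scipy.optimize.linprog` are our own illustrative choices.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. Ax = b as a linear program.

    Split x = xp - xn with xp, xn >= 0, so that ||x||_1 = sum(xp + xn)
    becomes a linear objective over nonnegative variables.
    """
    m, n = A.shape
    c = np.ones(2 * n)                     # objective: sum of xp and xn
    A_eq = np.hstack([A, -A])              # A (xp - xn) = b
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    xp, xn = res.x[:n], res.x[n:]
    return xp - xn

# Underdetermined system with a sparse ground-truth solution
rng = np.random.default_rng(0)
A = rng.standard_normal((8, 20))
x0 = np.zeros(20)
x0[[3, 11]] = [1.5, -2.0]                  # 2-sparse ground truth
b = A @ x0
x = basis_pursuit(A, b)
```

The standard variable split `x = xp - xn` turns the non-smooth $\ell_1$ objective into a linear one at the cost of doubling the number of variables; since `x0` is feasible, the LP optimum can never exceed `||x0||_1`.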
This unconventional equivalence between (P0) and (P1), together with more recent numerical methods [3, 16] for efficiently recovering high-dimensional sparse signals, has made this a highly active research area in CS. Its broad applications include sparse error correction [6], compressive imaging [23], image denoising and restoration [11, 17], and face recognition [13, 21], to name a few.
In addition to enforcing entry-wise sparsity in a linear system of
equations, the notion of group sparsity has attracted increasing
attention in recent years [12, 13, 18].
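The $\ell_{1,\infty}$ relaxation for group sparsity is likewise a linear program: one slack variable per group upper-bounds the absolute values of that group's entries, and the objective sums the slacks. The sketch below is illustrative (the function name, group encoding, and solver choice are ours, not the paper's).

```python
import numpy as np
from scipy.optimize import linprog

def l1_inf_min(A, b, groups):
    """min sum_g max_{j in g} |x_j|  s.t.  Ax = b, as an LP.

    Variables: x (n entries, unbounded) plus one slack t_g per group,
    with |x_j| <= t_{g(j)} enforced by two linear inequalities per entry.
    """
    m, n = A.shape
    G = len(groups)
    c = np.concatenate([np.zeros(n), np.ones(G)])   # minimize sum of t_g
    rows = []
    for g, idx in enumerate(groups):
        for j in idx:
            r = np.zeros(n + G); r[j] = 1.0; r[n + g] = -1.0   #  x_j - t_g <= 0
            rows.append(r)
            r = np.zeros(n + G); r[j] = -1.0; r[n + g] = -1.0  # -x_j - t_g <= 0
            rows.append(r)
    A_ub = np.array(rows)
    A_eq = np.hstack([A, np.zeros((m, G))])          # equality acts on x only
    bounds = [(None, None)] * n + [(0, None)] * G
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(len(rows)),
                  A_eq=A_eq, b_eq=b, bounds=bounds)
    return res.x[:n]

# One active group out of three
rng = np.random.default_rng(1)
A = rng.standard_normal((6, 12))
groups = [list(range(0, 4)), list(range(4, 8)), list(range(8, 12))]
x0 = np.zeros(12)
x0[4:8] = [1.0, -0.5, 2.0, 0.3]
b = A @ x0
x = l1_inf_min(A, b, groups)
```

As with basis pursuit, the ground truth `x0` is feasible, so the recovered solution's $\ell_{1,\infty}$ value is at most that of `x0`.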
We have presented a novel analysis of several sparsity minimization problems, which allows us to interpret several convex relaxations of the original NP-hard primal problems as being equivalent to maximizing their Lagrangian duals. The pivotal point of this analysis is the formulation of mixed-integer programs that are equivalent to the original primal problems. While we have derived the biduals for only a few sparsity minimization problems, the same techniques can be used to derive convex relaxations for other sparsity minimization problems.

An interesting result of our biduality framework is the ability to compute a per-instance certificate of optimality by providing a lower bound on the primal objective function. This is in contrast to most previous research, which aims to characterize either the subset of solutions or the set of conditions for perfect sparsity recovery using the convex relaxations [5, 6, 8–10, 14, 15, 20]. In most cases, those conditions are either weak or hard to verify; more importantly, they need to be precomputed, as opposed to verifying the correctness of a solution at run-time. In light of this, we hope that our proposed framework will prove an important step towards per-instance verification of the solutions. In particular, it would be interesting to explore tighter relaxations for such verification in future work.
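As a concrete illustration of such a per-instance lower bound, consider the standard mixed-integer reformulation of $\ell_0$-minimization under an assumed amplitude bound $M$ on the entries of feasible solutions: min $\sum_i z_i$ s.t. $Ax = b$, $|x_i| \le M z_i$, $z_i \in \{0,1\}$. Its bidual value is $\min \|x\|_1 / M$, so the ceiling of that quantity certifies a floor on the sparsity of any feasible $x$ with $\|x\|_\infty \le M$. The bound $M$ and all names below are illustrative assumptions based on this reformulation, not the paper's code.

```python
import numpy as np
from math import ceil
from scipy.optimize import linprog

def sparsity_lower_bound(A, b, M):
    """Lower bound on ||x||_0 over {x : Ax = b, ||x||_inf <= M}.

    For the mixed-integer model  min sum z_i  s.t.  Ax = b, |x_i| <= M z_i,
    z binary, the bidual value is  min ||x||_1 / M,  so
    ceil(l1_min / M) is a valid floor on the sparsity of any such x.
    (Illustrative sketch; M is an assumed a-priori amplitude bound.)
    """
    m, n = A.shape
    c = np.ones(2 * n)
    A_eq = np.hstack([A, -A])              # basis-pursuit LP for l1_min
    res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n))
    return ceil(res.fun / M - 1e-9)        # small slack for solver round-off

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 10))
x0 = np.zeros(10)
x0[[2, 7]] = [1.0, -1.0]                   # 2-sparse, entries bounded by 1
b = A @ x0
lb = sparsity_lower_bound(A, b, M=1.0)
```

Because `x0` is feasible with `||x0||_inf <= 1`, the certified bound can never exceed its sparsity of 2; any nonzero `b` forces the bound to be at least 1.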