
Say I have to solve a large QUBO (quadratic unconstrained binary optimization) problem $$ \min_{x}{x^\top Qx}, $$ where $x\in\{0,1\}^N$ is a binary variable and $Q\in\mathbb{R}^{N\times N}$ encodes the problem. This is in general an intractable task. Now suppose that somebody gives me the global minimum value for this problem (that is, the smallest possible value of ${x^\top Qx}$) but not a minimizing argument $x_{\min}$. Does this somehow alleviate the task of finding an $x_{\min}$?

One strategy, whose effectiveness I am not sure about: if I randomly sample solutions $x$, I can efficiently check each one against the known minimal value, which tells me how close I am to the global minimum. But I may just be exploring a suboptimal valley that is very far from the global minimum in the landscape.
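A minimal sketch of this sampling strategy, using the known optimal value as a stopping certificate (the toy instance, function names, and try budget below are illustrative choices, not part of the question):

```python
import itertools
import random

def qubo_value(Q, x):
    """Objective x^T Q x for a binary vector x."""
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def sample_until_optimal(Q, known_min, max_tries=100_000, seed=0):
    """Randomly sample binary vectors; the known optimal value acts as a
    certificate: the first sample attaining it is a global minimizer."""
    rng = random.Random(seed)
    n = len(Q)
    best_x, best_val = None, float("inf")
    for _ in range(max_tries):
        x = [rng.randint(0, 1) for _ in range(n)]
        v = qubo_value(Q, x)
        if v < best_val:
            best_x, best_val = x, v
        if best_val == known_min:   # certificate of optimality: stop here
            break
    return best_x, best_val

# Toy instance small enough that the optimal value is known by enumeration.
Q = [[1, -2, 0],
     [-2, 1, 0],
     [0, 0, 3]]
known_min = min(qubo_value(Q, x) for x in itertools.product([0, 1], repeat=3))
x_star, val = sample_until_optimal(Q, known_min)
```

Note that the known optimum only helps with *stopping*: if sampling never lands in the right valley, the certificate is never triggered and the budget is exhausted.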

Note that the problem may have many global minimizers $x_{\min}$. So by solving I mean obtaining at least one $x_{\min}$.

If it cannot help find a minimizing solution, can this knowledge speed up some sort of approximate search for close-to-optimal solutions?

  • If you find an $x$ with the provided minimal cost, this information tells you that it is the global minimum and you can stop searching for better ones. This definitely alleviates the problem...
    – NeitherNor
    Commented Jun 19, 2020 at 20:29
  • I know the min but not the argmin, not the other way around.
    – qubist
    Commented Jun 19, 2020 at 22:12
  • @qubist, Engineer meant that, given the minimal cost, you can stop as soon as you find a solution that attains that lower bound, because you then know that it is optimal.
    – RobPratt
    Commented Jun 19, 2020 at 22:24

2 Answers


Providing such a bound means that you can reduce the search space. In a branch-and-bound algorithm, this means that any node whose lower bound exceeds the provided minimum value can be pruned because the resulting subtree cannot contain an optimal solution.
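A toy sketch of this pruning rule (the term-wise lower bound, the instance, and all names below are my own illustrative choices; real branch-and-bound codes use much stronger relaxation bounds):

```python
import itertools

def term_bound(Q, assign, i, j):
    """Smallest achievable value of Q[i][j]*x_i*x_j under a partial
    assignment (assign[k] is 0, 1, or None when x_k is still free)."""
    xi, xj = assign[i], assign[j]
    if xi is not None and xj is not None:
        return Q[i][j] * xi * xj          # term fully determined
    if xi == 0 or xj == 0:
        return 0                          # a zero factor kills the term
    return min(0, Q[i][j])                # a free variable can zero the term

def lower_bound(Q, assign):
    """Valid lower bound: minimize each term independently."""
    n = len(Q)
    return sum(term_bound(Q, assign, i, j) for i in range(n) for j in range(n))

def branch_and_bound(Q, known_min):
    """Depth-first branch and bound; prune any node whose lower bound
    exceeds the known optimal value, and stop once that value is hit."""
    n = len(Q)
    def dfs(assign, k):
        if lower_bound(Q, assign) > known_min:
            return None                   # subtree cannot contain an optimum
        if k == n:                        # leaf: bound equals exact value
            return list(assign) if lower_bound(Q, assign) == known_min else None
        for v in (0, 1):
            assign[k] = v
            found = dfs(assign, k + 1)
            if found is not None:
                return found
            assign[k] = None
        return None
    return dfs([None] * n, 0)

Q = [[1, -2, 0],
     [-2, 1, 0],
     [0, 0, 3]]
known_min = min(sum(Q[i][j] * x[i] * x[j] for i in range(3) for j in range(3))
                for x in itertools.product([0, 1], repeat=3))
x_star = branch_and_bound(Q, known_min)   # → [1, 1, 0]
```

The known optimum is used twice here: as the pruning threshold (any node whose bound exceeds it is discarded) and as the stopping certificate at a leaf.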

  • That is good to know.
    – qubist
    Commented Jun 19, 2020 at 22:14
  • Branch and bound using the quadratic objective works well if $Q$ is positive semidefinite (the objective of the QP relaxation is convex). However, if $Q$ is indefinite, then branch and bound requires solving (potentially) non-convex QPs at each node.
    Commented Jun 19, 2020 at 23:25
  • @BrianBorchers, you can always make $Q$ positive definite by introducing a linear part of the objective, as in Theorem 1 of Hammer and Rubin (1970).
    – RobPratt
    Commented Jun 19, 2020 at 23:47

A standard approach is to linearize the 0-1 quadratic program by introducing auxiliary variables

$y_{ij}=x_{i}x_{j}$

and then solving the QUBO as a 0-1 integer linear program.

If all variables (including the $y_{ij}$) are treated as 0-1 integer variables, then the product constraint $y_{ij}=x_{i}x_{j}$ can easily be enforced by the linear inequality constraints:

$2y_{ij} \leq x_{i}+x_{j}$

$y_{ij} \geq x_{i}+x_{j}-1$

If you know the optimal objective value, you can enforce it as a linear equality constraint in the 0-1 integer linear program and its LP relaxations. This additional constraint might dramatically reduce the solution time.
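As a rough illustration of this answer (the instance and helper names are mine, and the brute-force "solver" below is a stand-in: in practice the linearized model would be handed to a MILP solver):

```python
import itertools

def linearization_ok(xi, xj, yij):
    """The two linear inequalities from the answer, on 0-1 values."""
    return 2 * yij <= xi + xj and yij >= xi + xj - 1

# On binary variables the inequalities force y_ij = x_i * x_j exactly:
for xi, xj, yij in itertools.product([0, 1], repeat=3):
    assert linearization_ok(xi, xj, yij) == (yij == xi * xj)

def solve_with_known_optimum(Q, known_min):
    """Brute-force stand-in for a MILP solver: the linear equality
    constraint sum_ij Q[i][j]*y[i][j] == known_min certifies the first
    feasible point as optimal, so the search can stop immediately."""
    n = len(Q)
    for x in itertools.product([0, 1], repeat=n):
        y = [[x[i] * x[j] for j in range(n)] for i in range(n)]
        if sum(Q[i][j] * y[i][j] for i in range(n) for j in range(n)) == known_min:
            return list(x)

# Toy instance whose optimal value -2 is attained only at x = (1, 1, 0).
Q = [[1, -2, 0],
     [-2, 1, 0],
     [0, 0, 3]]
print(solve_with_known_optimum(Q, -2))   # → [1, 1, 0]
```

The first loop checks that, on binaries, the two inequalities characterize the product exactly; the second part shows the known optimum acting as an equality constraint that turns optimization into a feasibility check.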

  • It is stronger to disaggregate $2y_{ij} \le x_i+x_j$ into $y_{ij} \le x_i$ and $y_{ij} \le x_j$. Also, explicitly adding a dense objective cut can hurt; it might be better to input the lower bound as a target.
    – RobPratt
    Commented Jun 19, 2020 at 23:50
  • A lot depends on the particular problem. For example, if $Q$ is diagonal, then variables $x_{i}$ with very large positive or negative entries in $Q$ could be fixed by preprocessing against the objective constraint: say the optimal objective is $-5$, and $Q$ has a small number of entries in the range $-10$ to $+10$ and a bunch of entries whose absolute values are multiples of $1000000$. Preprocessing would quickly eliminate the large entries.
    Commented Jun 20, 2020 at 0:21
  • Clever! Is there any general lesson from the complexity-theory point of view?
    – qubist
    Commented Jun 20, 2020 at 15:06
  • No, I wouldn't expect there to be anything useful in terms of computational complexity; this is just something that might help in practice. I believe the problem is still NP-hard.
    Commented Jun 20, 2020 at 19:52
