
I could not find the answer on the Internet. The case of quadratic programming with constraints is already settled on this forum; see Transforming SAT to Quadratic Programming in polynomial time. But what happens if there are no constraints, or if the only allowed constraint is that the variables are non-negative?


1 Answer


1-IN-3SAT is an NP-complete variant of 3SAT in which the goal is to determine whether some assignment satisfies exactly one literal per clause. We can convert an instance of 1-IN-3SAT (even without the restriction on the number of literals per clause) to an equivalent instance of quadratic programming in which all the variables are constrained to be non-negative as follows:

  • For each variable $x_i$, the quadratic program has two variables $x_i^+,x_i^-$, constrained to be non-negative.

  • The objective is to minimize

$$ \sum_i \left[(x_i^+ + x_i^- - 1)^2 + x_i^+ x_i^-\right] + \sum_j (C_j - 1)^2, $$ where $i$ ranges over all variables, $j$ ranges over all clauses, and $C_j$ is the sum of the literals of the $j$th clause, written in terms of the program variables by substituting $x_i^+$ for $x_i$: for example, if the $j$th clause is $x_1 \lor \bar{x}_2 \lor x_3$ then $C_j = x_1^+ + (1-x_2^+) + x_3^+$.

By construction the objective function is non-negative on the feasible region. Over non-negative values, $(x_i^+ + x_i^- - 1)^2 + x_i^+ x_i^- = 0$ forces $x_i^+ + x_i^- = 1$ and $x_i^+ x_i^- = 0$, i.e. $\{x_i^+, x_i^-\} = \{0,1\}$, so the first sum vanishes exactly when the $x_i^\pm$ encode a $0/1$ assignment to the original variables; the second sum then vanishes if and only if this assignment satisfies exactly one literal per clause. In particular, the optimum value is $0$ if and only if the instance has an exactly-1 satisfying assignment.
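To make the construction concrete, here is a small Python sketch (my own, not part of the original answer) that evaluates the objective for a clause set given in DIMACS-like form (literal $k$ means $x_k$, literal $-k$ means $\bar{x}_k$); the names `objective` and `exactly_one_satisfiable` are placeholders. The brute-force check only walks over $0/1$ points with $x_i^- = 1 - x_i^+$, so it is a sanity check of the Boolean direction of the reduction, not the continuous minimization.

```python
from itertools import product

def objective(clauses, n_vars, xp, xm):
    """Value of sum_i [(x_i^+ + x_i^- - 1)^2 + x_i^+ x_i^-] + sum_j (C_j - 1)^2."""
    val = 0.0
    for i in range(n_vars):
        val += (xp[i] + xm[i] - 1.0) ** 2 + xp[i] * xm[i]
    for clause in clauses:
        # C_j: sum of literal values; literal k contributes xp[k-1],
        # a negated literal -k contributes 1 - xp[k-1].
        cj = sum(xp[k - 1] if k > 0 else 1.0 - xp[-k - 1] for k in clause)
        val += (cj - 1.0) ** 2
    return val

def exactly_one_satisfiable(clauses, n_vars):
    """Brute-force reference check: does some 0/1 assignment satisfy exactly one literal per clause?"""
    for bits in product((0, 1), repeat=n_vars):
        if all(sum(bits[k - 1] if k > 0 else 1 - bits[-k - 1] for k in c) == 1
               for c in clauses):
            return True
    return False

if __name__ == "__main__":
    clauses = [[1, -2, 3], [-1, 2, 3]]          # toy instance
    n = 3
    # On 0/1 points with x^- = 1 - x^+, the objective is 0 exactly when the
    # assignment satisfies one literal per clause.
    best = min(objective(clauses, n, bits, [1 - b for b in bits])
               for bits in product((0, 1), repeat=n))
    print(best == 0.0, exactly_one_satisfiable(clauses, n))
```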


Without any constraints, the problem becomes easy. Sylvester's law of inertia (whose change of basis can be computed efficiently) shows that up to a linear change of basis, we can assume that the quadratic to be minimized is $$ \sum_{i \in A} z_i^2 - \sum_{j \in B} z_j^2 + 2\sum_k c_k z_k + M, $$ where $A,B$ are disjoint. If $B$ is non-empty then the quadratic is unbounded from below, so let us assume that $B = \emptyset$. If $c_k \neq 0$ for some $k \notin A$, then the quadratic is again unbounded from below, so let us assume that the $c_k$ are supported on $A$. The quadratic is then equal to $$ \sum_{i \in A} [(z_i + c_i)^2 - c_i^2] + M, $$ showing that the minimum is $M - \sum_{i \in A} c_i^2$.
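The same argument can be phrased in matrix form: writing the quadratic as $z^\top Q z + 2 b^\top z + M$ with $Q$ symmetric, it is bounded from below if and only if $Q$ is positive semidefinite and $b$ lies in the range of $Q$, in which case the minimum is $M - b^\top Q^+ b$ (with $Q^+$ the pseudoinverse), matching $M - \sum_{i \in A} c_i^2$ after diagonalization. A small numpy sketch of this check (again mine, not from the answer; the tolerances are ad hoc):

```python
import numpy as np

def unconstrained_minimum(Q, b, M, tol=1e-9):
    """Minimum of z^T Q z + 2 b^T z + M over all of R^n, or None if unbounded below."""
    Q = (Q + Q.T) / 2.0                       # symmetrize
    eigvals, eigvecs = np.linalg.eigh(Q)
    if eigvals.min() < -tol:
        return None                           # negative eigenvalue: unbounded below
    # components of b along (numerically) zero eigenvalues must vanish
    b_in_eigbasis = eigvecs.T @ b
    if np.any((np.abs(eigvals) <= tol) & (np.abs(b_in_eigbasis) > tol)):
        return None                           # linear term escapes along a flat direction
    return M - b @ np.linalg.pinv(Q) @ b      # = M - sum_{i in A} c_i^2 in diagonalized form

if __name__ == "__main__":
    Q = np.array([[2.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, -1.0])
    print(unconstrained_minimum(Q, b, M=5.0))  # expect 3.0
```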


A third variant that can be considered is a homogeneous objective function with non-negativity constraints. This variant is also easy.

The law of inertia transforms the problem into that of minimizing $$ \sum_{i \in A} z_i^2 - \sum_{j \in B} z_j^2 $$ under some arbitrary linear constraints.

Using linear algebra, we can check whether the linear constraints force $z_j = 0$ for all $j \in B$. If not, then the quadratic form is unbounded from below, so let us assume that this is the case. Our quadratic form is then positive semidefinite, and convex optimization algorithms can be used to minimize it (see Wikipedia).
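For completeness, a hedged sketch of only this last, convex step, assuming the boundedness check above has already been carried out: once the form is positive semidefinite, minimizing $x^\top Q x$ over $x \ge 0$ is a convex problem (for a homogeneous PSD form the optimum is simply $x = 0$ with value $0$). Scipy's bound-constrained L-BFGS-B stands in here for the "convex optimization algorithms" mentioned above; it is not the answer's own procedure, just one way to handle the convex subcase.

```python
import numpy as np
from scipy.optimize import minimize

def minimize_psd_form_nonneg(Q):
    """Minimize x^T Q x subject to x >= 0, for a PSD matrix Q."""
    Q = (Q + Q.T) / 2.0
    if np.linalg.eigvalsh(Q).min() < -1e-9:
        raise ValueError("Q is not PSD; the boundedness check above applies instead.")
    n = Q.shape[0]
    res = minimize(lambda x: x @ Q @ x, np.ones(n),
                   jac=lambda x: 2.0 * Q @ x,
                   bounds=[(0.0, None)] * n, method="L-BFGS-B")
    return res.x, res.fun

if __name__ == "__main__":
    Q = np.array([[1.0, -2.0], [-2.0, 5.0]])   # positive definite
    x_opt, val = minimize_psd_form_nonneg(Q)
    print(np.round(x_opt, 6), round(val, 6))   # expect the origin, value 0
```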

  • The way you enforce $\{x_i^+,x_i^-\}=\{0,1\}$ at optimality is a neat trick. You can also use it to obtain a reduction from the (bi-)partition problem for elements $a_1,a_2,\dots,a_n$, i.e., one can solve bi-partition by minimizing $\sum_i \left[(x_i^+ + x_i^- - 1)^2 + x_i^+ x_i^-\right] + \left( \sum_i x_i^+ a_i - \frac 12 \sum_i a_i\right)^2$. Commented Oct 8, 2019 at 14:10
