Consider the linear programming problem \begin{align} f^* = \max_{x} \;& p^T x \\ \text{s.t.} \;& A^T x \le b, \quad 0 \le x_i \le 1, \end{align} where $p$ is an $n\times 1$ vector, $A$ is an $n\times c$ matrix, and $b$ is a $c \times 1$ vector. Here $x=[x_1,\dots,x_n]^T$ is the $n \times 1$ vector to be found. Thus, in addition to the box constraints $0\leq x_i\leq 1$, we have $c$ constraints. It is also known that $p$, $A$, and $b$ are all element-wise positive.

I randomly generated $p,A,b$ such that each entry is i.i.d. and uniformly distributed on the interval $[0,1]$. Note that this meets the problem set-up. I observed that whenever the problem was feasible, the optimal solution $x^*=[x_1^*,\dots,x_n^*]$ had an interesting property: the number of fractional $x_i^*$ (i.e., $0<x_i^*<1$) was at most $c$. Every other $x_i^*$ belonged to $\{0,1\}$ (up to rounding). Is there something going on here?
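The experiment described above can be reproduced with a short script. This is a sketch, not the asker's original code; the instance sizes `n`, `c`, the seed, and the tolerance `tol` are illustrative choices. It uses `scipy.optimize.linprog` with the HiGHS dual-simplex method, which returns a basic (vertex) optimal solution.

```python
# Sketch of the random experiment from the question (hypothetical parameters).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, c = 20, 3                       # n variables, c extra constraints (illustrative)

# Element-wise positive data, i.i.d. uniform on [0, 1], as in the question.
p = rng.uniform(size=n)
A = rng.uniform(size=(n, c))
b = rng.uniform(size=c)

# linprog minimizes, so negate p to maximize p^T x.
# Constraints: A^T x <= b, plus box constraints 0 <= x_i <= 1.
# (x = 0 is always feasible here since b > 0.)
res = linprog(-p, A_ub=A.T, b_ub=b, bounds=[(0, 1)] * n, method="highs-ds")

x = res.x
tol = 1e-7
n_frac = int(np.sum((x > tol) & (x < 1 - tol)))
print(f"fractional components: {n_frac} (c = {c})")
```

In runs like this, `n_frac` comes out at most `c`, matching the observation in the question; a simplex-type solver returns a vertex of the feasible region, which is what drives the pattern.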


1 Answer


Yes, this is a known consequence of the fact that there always exists an optimal solution that is basic (an extreme point of the feasible region). This property is the foundation of the simplex method, which moves from one basis to another in each iteration.
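To connect this to the bound in the question, here is a sketch of the standard counting argument (my paraphrase, not quoted from a specific text): at a basic solution of an LP in $n$ variables, at least $n$ linearly independent constraints are tight. At most $c$ of these can come from $A^T x \le b$, so at least $n - c$ of the box constraints $0 \le x_i \le 1$ are tight, forcing $x_i^* \in \{0,1\}$ for those indices. Hence \begin{align} \#\{\, i : 0 < x_i^* < 1 \,\} \;\le\; n - (n - c) \;=\; c . \end{align}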

  • Can you please point me to a proof? As far as I know, the simplex method should work regardless of randomness in $p,A,b$ as long as the problem is feasible. Are you suggesting that, whenever it is feasible, the solution will obey this property? Commented Apr 26, 2020 at 2:50
  • There is probably a proof in Chvátal's Linear Programming (1983). Yes, the property holds for problems that have an optimal solution (feasible and not unbounded).
    – RobPratt
    Commented Apr 26, 2020 at 16:07
