7
$\begingroup$

I have a fixed number of bins which are themselves weightless. Each bin can hold only a fixed amount of weight. Not all bins have the same capacity.

I also have a fixed number of objects each of which has a weight. Not all objects have the same weight.

I would like to place objects in the bins in a way which respects their capacity while minimizing the weight of the heaviest bin.

What is the name of this problem?

(If the capacities were infinite, this would reduce to multi-way number partitioning.

With one bin and a low number of unique weights this looks like a cutting stock problem.)

$\endgroup$

4 Answers

10
$\begingroup$

Let $w_o$ denote the weight of object $o$, and let $c_b$ denote the capacity of bin $b$.

You can interpret this as a job shop scheduling problem. The correspondence is that each object is a job with duration $w_o$, each bin is a machine that is available for only $c_b$ time units, and the makespan, called $z$ in the formulation below, is the quantity to minimize.

It is also a special case of the bottleneck generalized assignment problem where the costs depend only on the tasks (objects) and not the agents (bins).

You can solve the problem via mixed integer linear programming as follows. Let binary variable $x_{o,b}$ indicate whether object $o$ is assigned to bin $b$. The problem is to minimize $z$ subject to \begin{align} \sum_b x_{o,b} &= 1 &&\text{for all $o$} \tag1\\ \sum_o w_o x_{o,b} &\le c_b &&\text{for all $b$} \tag2\\ \sum_o w_o x_{o,b} &\le z &&\text{for all $b$} \tag3\\ \end{align} Constraint $(1)$ assigns each object to exactly one bin. Constraint $(2)$ enforces the bin capacity. Constraint $(3)$ enforces the minimax objective.
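
For concreteness, here is a minimal sketch of this formulation in Python with PuLP; the data values, variable names, and default CBC solver below are illustrative assumptions rather than part of the formulation itself.

import pulp

weights = {1: 4, 2: 7, 3: 3, 4: 5}       # w_o for each object (hypothetical data)
capacities = {1: 10, 2: 9}                # c_b for each bin (hypothetical data)
objects, bins = list(weights), list(capacities)

model = pulp.LpProblem("min_heaviest_bin", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (objects, bins), cat="Binary")
z = pulp.LpVariable("z", lowBound=0)
model += z                                             # objective: minimize z

for o in objects:                                      # (1) each object in exactly one bin
    model += pulp.lpSum(x[o][b] for b in bins) == 1
for b in bins:
    load = pulp.lpSum(weights[o] * x[o][b] for o in objects)
    model += load <= capacities[b]                     # (2) bin capacity
    model += load <= z                                 # (3) minimax objective

model.solve()                                          # default CBC solver
print(pulp.LpStatus[model.status], pulp.value(z))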


Sparse formulation: introduce bin variable $y_b$ to represent the common LHS of $(2)$ and $(3)$, and minimize $z$ subject to \begin{align} \sum_b x_{o,b} &= 1 &&\text{for all $o$} \\ \sum_o w_o x_{o,b} &= y_b &&\text{for all $b$} \\ y_b &\le c_b &&\text{for all $b$} \\ y_b &\le z &&\text{for all $b$} \\ \end{align}


If many items have the same weight, you can exploit symmetry and reduce the problem size by letting $n_t$ be the number of objects of type $t$ (with weight $w_t$) and treating $x_{t,b}$ as an integer variable that represents the number of objects of type $t$ that are assigned to bin $b$. The problem is to minimize $z$ subject to \begin{align} \sum_b x_{t,b} &= n_t &&\text{for all $t$} \\ \sum_t w_t x_{t,b} &= y_b &&\text{for all $b$} \\ y_b &\le c_b &&\text{for all $b$} \\ y_b &\le z &&\text{for all $b$} \\ \end{align}
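
The same idea, combining the sparse $y_b$ variables with the type aggregation, might look roughly like this in PuLP (again, the data and names are made up for illustration):

import pulp

type_weight = {"A": 4, "B": 7}            # w_t (hypothetical)
type_count  = {"A": 3, "B": 2}            # n_t (hypothetical)
capacities  = {1: 15, 2: 12}              # c_b (hypothetical)
types, bins = list(type_weight), list(capacities)

model = pulp.LpProblem("min_heaviest_bin_aggregated", pulp.LpMinimize)
x = pulp.LpVariable.dicts("x", (types, bins), lowBound=0, cat="Integer")
y = {b: pulp.LpVariable(f"y_{b}", lowBound=0, upBound=capacities[b]) for b in bins}
z = pulp.LpVariable("z", lowBound=0)
model += z                                             # minimize z

for t in types:                                        # place all n_t objects of type t
    model += pulp.lpSum(x[t][b] for b in bins) == type_count[t]
for b in bins:                                         # y_b is the load of bin b; capacity via upBound
    model += pulp.lpSum(type_weight[t] * x[t][b] for t in types) == y[b]
    model += y[b] <= z

model.solve()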

$\endgroup$
12
  • 4
    $\begingroup$ It might help to introduce a new variable and constraint to represent the common LHS of (2) and (3). That will make the problem sparser and avoid nearly parallel constraints. $\endgroup$
    – RobPratt
    Commented Feb 28, 2021 at 3:18
  • 1
    $\begingroup$ Do you have lots of duplications of weights and/or capacities? If so, there is an opportunity to exploit symmetry. How many objects and bins do you have? Are you able to share your data? $\endgroup$
    – RobPratt
    Commented Feb 28, 2021 at 3:22
  • 1
    $\begingroup$ @RobPratt, could you please explain how this problem might be interpreted as a JSP? In the BPP, the items can be placed in the bins in any order, but in the JSP each job and its related operations must be performed along a specific, pre-determined route. Also, could you explain how to represent the common LHS of (2) and (3) by introducing a new variable and constraint? $\endgroup$
    – A.Omidi
    Commented Feb 28, 2021 at 7:14
  • 1
    $\begingroup$ @A.Omidi This variant of job shop where the order is arbitrary is called open shop. For the common LHS, I suggest introducing new variable $y_b$ and replacing constraints (2) and (3) with $y_b=\sum_o w_o x_{o,b}$, $y_b \le c_b$, $y_b \le z$. $\endgroup$
    – RobPratt
    Commented Feb 28, 2021 at 14:21
  • 2
    $\begingroup$ @Richard If your solver does not automatically break symmetry with respect to the bins (via orbital branching or otherwise), you can try imposing explicit symmetry-breaking constraints $y_b \ge y_{b+1}$. More generally, you would impose these only within each family of bins that have the same capacity. $\endgroup$
    – RobPratt
    Commented Feb 28, 2021 at 17:48
6
$\begingroup$

With a low to moderate number of distinct weights, this is a variant of a transportation problem. An appropriate model would be $$\begin{alignat*}{1} \min\ \ & w_{\max}\\ \text{s.t.}\ \ \ & \sum_{j=1}^{J}x_{ij}=n_{i}\ \ \ i=1,\dots,I\\ & \sum_{i=1}^{I}a_{i}x_{ij}=w_{j}\ \ \ j=1,\dots,J\\ & w_{j}\le w_{\max}\ \ \ j=1,\dots,J\\ & 0\le w_{j}\le c_{j}\ \ \ j=1,\dots,J\\ & x_{ij}\in\mathbb{Z}_{\ge0}\ \ \ \forall i,j \end{alignat*}$$where $I$ and $J$ are the number of distinct weight classes and number of bins respectively, $n_i$ is the number of items with weight $a_i$, and $c_j$ is the capacity of bin $j$.

Using counts within a weight class assigned to each bin, rather than indicators for individual items, shrinks the model and removes one source of symmetry (invariance of the solution with respect to permuting indices of items with identical weights). The other potential source of symmetry is invariance of the solution with respect to permuting indices of bins with identical capacity. If there are such bins, you can add constraints to eliminate the bin symmetry. Assume the bins are already sorted in ascending capacity order ($c_1 \le \dots \le c_J$). For each group of bins $j_0, \dots, j_1$ with identical capacity, add constraints requiring $w_{j_0} \le w_{j_{0}+1} \le \dots \le w_{j_1}$. The idea is to fill identical bins in nondecreasing order of the weight dumped into each bin.
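
As a rough illustration only: if the model were built in Python with PuLP, with load variables w[j] and a capacity list caps already sorted in ascending order (these names are assumptions, not part of the model above), the symmetry-breaking constraints could be added along these lines:

from itertools import groupby

# group consecutive bin indices that share the same capacity (caps is sorted)
for _, group in groupby(range(len(caps)), key=lambda j: caps[j]):
    idx = list(group)
    for j0, j1 in zip(idx, idx[1:]):
        model += w[j0] <= w[j1]   # fill identical bins in nondecreasing load order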

I ran the example you posted as a comment to Rob's solution (64 bins, no weight limits on a bin). The optimal solution had a maximum weight of 1280 for any bin (occurring in 41 of the 64 bins). CPLEX 20 solved it to optimality in something like 0.04 seconds.

$\endgroup$
3
  • 1
    $\begingroup$ Many thanks for the explanation of the symmetry breaking. $\endgroup$
    – A.Omidi
    Commented Feb 28, 2021 at 20:03
  • $\begingroup$ Interestingly, using CBC as a solver and employing the symmetry breaking you describe while removing the greedy numbering heuristic I originally employed takes time-to-solution from 0.39s to 8s and the problem can only be solved to "optimal inaccurate" status. Keeping both the heuristic and the symmetry breaking restores the good timing. This might be an artefact of CBC. Still, using symmetry breaking provides better performance and more solutions than not. $\endgroup$
    – Richard
    Commented Mar 2, 2021 at 3:42
  • $\begingroup$ Like pretty much everything else related to MIP models, symmetry-breaking is subject to the whims of the algorithmic gods. It can make the solution time shorter (possibly much shorter) or longer. In my experience, it helps more often than it hurts, but it does sometimes hurt. $\endgroup$
    – prubin
    Commented Mar 3, 2021 at 16:52
4
$\begingroup$

Your problem can be modeled differently, using a set-based approach instead of the classical Boolean approach proposed above. This modeling approach, offered by Hexaly, differs from that of traditional MILP solvers. Note that Hexaly is commercial software; nevertheless, it is free for faculty and students.

Below is a code snippet that models your bin packing problem with the Hexaly Modeler:

/* Your model written in LSP. */
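// Assumed context: nbItems, nbMaxBins, itemWeights, and binCapacities are read beforehand (e.g., in an input() function).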
function model() {
  // Set decisions: bins[k] represents the items in bin k
  bins[k in 0..nbMaxBins-1] <- set(nbItems);

  // Each item must be in one bin and one bin only
  constraint partition[k in 0..nbMaxBins-1](bins[k]);

  for [k in 0..nbMaxBins-1] {
    // Weight constraint for each bin
    binWeights[k] <- sum(bins[k], i => itemWeights[i]);
    constraint binWeights[k] <= binCapacities[k];
  }

  // Compute the weight of the heaviest bin
  heaviestBin <- max[k in 0..nbMaxBins-1](binWeights[k]);

  // Minimize the weight of the heaviest bin
  minimize heaviestBin;
}

You can write the same model in Python, Java, C#, or C++. For that, you can take inspiration from the original bin packing example described here.

With this modeling approach, Hexaly finds high-quality solutions (optimality gap below 1%) in a few minutes on a standard computer for instances with thousands of items to pack. For example, see this benchmark. Under the hood, Hexaly combines exact and heuristic methods to obtain such results.

$\endgroup$
3
$\begingroup$

Within CPLEX CP Optimizer, you can rely on the pack constraint.

A slight change to the model from the documentation:

using CP;
int m = 2;
int n = 3;

dvar int l[j in 1..m] in 0..10000;  // load of bin j
dvar int p[i in 1..n] in 1..m;      // bin in which item i is packed
dvar int nb;                        // number of non-empty bins

int w[1..n] = [i : 1 | i in 1..n];  // item weights (all equal to 1 in this toy example)

// minimizing the weight of the heaviest bin.
minimize max(i in 1..m) l[i];
subject to {
   
  pack(l, p, w, nb); 
  
}

assert nb==m-count(l,0);

which gives

l = [1 2];
p = [1 2 2];
nb = 2;
$\endgroup$
