
I would like to start using Python for modelling and solving optimization problems, both single-objective problems and multi-objective problems with a multidimensional objective space. For the multiobjective problems I'd like to use a metaheuristic, something like a multiobjective evolutionary algorithm (e.g. NSGA-II).

Now my question is: which Python package for OR is suitable for doing this? Can I, for example, use something like:

  • Pyomo
  • PuLP
  • pyOpt

I'd appreciate any comments and would be very thankful for your help.

Update: Here is a more detailed description of what I intend to do. Basically I have a multiobjective optimization problem (a mixed-integer linear program) with 2 objectives, and I would like to compare three methods in different scenarios of varying complexity:

  1. Weighted sum approach solved by an exact algorithm (e.g. using a commercial solver like CPLEX)
  2. Weighted sum approach solved by a single-objective metaheuristic (like a conventional evolutionary algorithm or particle swarm optimization)
  3. Real multiobjective optimization with a metaheuristic (like NSGA-II or MOPSO)

I'd like to do this all in Python, as I read here in the forum that Python is widely used in the OR community. Which packages would you advise me to use?

Additional note: By 'real' multiobjective optimization I mean not using a weighted sum approach (which collapses the objective space into a single dimension) but keeping a multidimensional objective space and trying to find the Pareto-optimal solutions (e.g. with NSGA-II, which is a 'real' multiobjective optimization metaheuristic).
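(To make this concrete, here is a small Python sketch of what I mean by Pareto-optimal solutions for two minimized objectives; the candidate points are made up purely for illustration.)

```python
# Hypothetical (f1, f2) values of candidate solutions, both objectives minimized.
candidates = [(3.0, 9.0), (4.0, 5.0), (6.0, 4.0), (7.0, 7.0), (9.0, 2.0)]

def dominates(a, b):
    # a dominates b if it is no worse in every objective and strictly better in at least one
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# The Pareto front is the set of candidates not dominated by any other candidate.
pareto_front = [p for p in candidates
                if not any(dominates(q, p) for q in candidates if q != p)]
print(pareto_front)  # [(3.0, 9.0), (4.0, 5.0), (6.0, 4.0), (9.0, 2.0)]
```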

  • If you don't find that the answers by @dhasson or me adequately address your concern, I suggest you edit the question to provide explicit details as to what you do and don't want. Commented Aug 14, 2020 at 16:24
  • @MarkL.Stone Thanks Mark for your comments. I just did what you told me.
    – PeterBe
    Commented Aug 17, 2020 at 14:59
  • I suggest you explain what "real multiobjective optimization" is. Commented Aug 17, 2020 at 16:05
  • Hi Mark, thanks for your comment. I am sorry for the imprecise term. By real multiobjective optimization I mean not using a weighted sum approach (which converts the objective space into a one-dimensional space) but keeping a multidimensional objective space and trying to find the Pareto-optimal solutions (e.g. with NSGA-II, which is a 'real' multiobjective optimization metaheuristic).
    – PeterBe
    Commented Aug 17, 2020 at 16:22
  • I suggest you edit this enhanced explanation into the question. Commented Aug 17, 2020 at 16:39

5 Answers


If you use packages like Pyomo, PuLP or pyOpt, you'd have to implement all the operations needed for multiobjective optimization yourself - e.g. finding nondominated solutions or the various mutation operators - which could take some time. An alternative is DEAP, a Python framework for evolutionary algorithms that has NSGA-II implemented. It's quite customizable, and you can easily interact with other Python libraries inside its routines (e.g. for mutation and crossover operations). A second library is jMetalPy, which has a broader scope, with more multiobjective optimization algorithms implemented (DEAP is focused on evolutionary algorithms).
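To give an idea of what that looks like, here is a minimal DEAP sketch of NSGA-II for a 2-objective minimization problem. The objective functions, variable bounds and parameter values below are placeholders I made up, not something prescribed by DEAP:

```python
import random
from deap import algorithms, base, creator, tools

# Two objectives, both minimized (negative weights mean minimization in DEAP).
creator.create("FitnessMin2", base.Fitness, weights=(-1.0, -1.0))
creator.create("Individual", list, fitness=creator.FitnessMin2)

def evaluate(ind):
    # Placeholder objectives - replace with your own two objective functions.
    f1 = sum(x ** 2 for x in ind)
    f2 = sum((x - 2.0) ** 2 for x in ind)
    return f1, f2

toolbox = base.Toolbox()
toolbox.register("attr", random.uniform, -5.0, 5.0)
toolbox.register("individual", tools.initRepeat, creator.Individual, toolbox.attr, n=10)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", evaluate)
toolbox.register("mate", tools.cxSimulatedBinaryBounded, eta=20.0, low=-5.0, up=5.0)
toolbox.register("mutate", tools.mutPolynomialBounded, eta=20.0, low=-5.0, up=5.0, indpb=0.1)
toolbox.register("select", tools.selNSGA2)   # NSGA-II nondominated sorting selection

pop = toolbox.population(n=100)
pop, _ = algorithms.eaMuPlusLambda(pop, toolbox, mu=100, lambda_=100,
                                   cxpb=0.7, mutpb=0.3, ngen=50, verbose=False)
front = tools.sortNondominated(pop, k=len(pop), first_front_only=True)[0]
```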

A second alternative is to model some of the objectives as a budget constraint and use Pyomo, PuLP, etc., with a varying parameter for that constraint's bound. In the end you'll have found a set of optimal solutions and will be able to approximate the nondominated (Pareto) front. There are also some LP- and MIP-specific multiobjective optimization algorithms in the literature; see for example this GitHub project, which works with Julia.
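As a rough sketch of that budget-constraint (epsilon-constraint) idea in Pyomo - the model data below is invented purely for illustration, and the solver name is whatever MIP solver you have installed:

```python
import pyomo.environ as pyo

def solve_for_epsilon(eps):
    # Optimize the first objective while the second one is bounded by eps.
    m = pyo.ConcreteModel()
    m.x = pyo.Var(within=pyo.NonNegativeIntegers, bounds=(0, 10))
    m.y = pyo.Var(within=pyo.NonNegativeReals, bounds=(0, 10))
    m.feas = pyo.Constraint(expr=m.x + 2 * m.y <= 14)
    m.budget = pyo.Constraint(expr=3 * m.x + m.y <= eps)   # second objective as a budget
    m.obj = pyo.Objective(expr=2 * m.x + m.y, sense=pyo.maximize)
    pyo.SolverFactory("cbc").solve(m)                      # or "glpk", "cplex", ...
    return pyo.value(m.obj), pyo.value(3 * m.x + m.y)

# Sweep the bound to trace an approximation of the Pareto front.
for eps in range(5, 35, 5):
    print(eps, solve_for_epsilon(eps))
```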

Other alternatives, like taking a linear combination of objectives, are contained in Mark's answer.


To answer the updated question: the OP wants to compare three methods for a multiobjective mixed-integer linear program with 2 objectives, in different scenarios of varying complexity, using Python:

  1. Weighted sum approach solved by an exact algorithm
  2. Weighted sum approach solved by a single-objective metaheuristic
  3. Multiobjective optimization with a metaheuristic (like NSGA-II or MOPSO), having a multidimensional objective space and trying to find the Pareto-optimal solutions.

I recommend the following for each scenario:

For the weighted sum approach, use Pyomo. This way you'll master a Python module that allows you to interact with Gurobi, CPLEX, GLPK, CBC, Mosek, BARON and other solvers, which is more tool-agnostic than working with a specific solver's API. Moreover, there's GAMS/Pyomo, which allows users to solve GAMS models using solvers within the Pyomo modeling system. This can be useful since you stated having used GAMS in the past.
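A minimal sketch of a weighted-sum sweep in Pyomo - again, the two objective expressions, variables and constraint are made-up placeholders for your actual MILP:

```python
import pyomo.environ as pyo

def build_model(w1, w2):
    m = pyo.ConcreteModel()
    m.x = pyo.Var(within=pyo.NonNegativeIntegers, bounds=(0, 10))
    m.y = pyo.Var(within=pyo.NonNegativeReals, bounds=(0, 10))
    m.c1 = pyo.Constraint(expr=m.x + 2 * m.y <= 14)
    m.f1 = pyo.Expression(expr=2 * m.x + m.y)        # first objective (minimize)
    m.f2 = pyo.Expression(expr=-(m.x + 3 * m.y))     # second objective (minimize)
    m.obj = pyo.Objective(expr=w1 * m.f1 + w2 * m.f2, sense=pyo.minimize)
    return m

solver = pyo.SolverFactory("cplex")   # or "gurobi", "cbc", "glpk", ...
for w1 in (0.0, 0.25, 0.5, 0.75, 1.0):
    m = build_model(w1, 1.0 - w1)
    solver.solve(m)
    print(w1, pyo.value(m.f1), pyo.value(m.f2))
```

Each weight vector gives one (supported) point of the Pareto front, so sweeping the weights approximates the front for scenario 1.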

For scenarios 2 and 3, you can use jMetalPy, which has several algorithms implemented for single-objective optimization (Evolution Strategy, Genetic Algorithm, Local Search, Simulated Annealing) and many more for multi-objective: 8 evolutionary algorithms (GDE3, HYPE, IBEA, MOCell, MOEA/D, NSGA-II, NSGA-III, SPEA2) and 2 PSO algorithms (OMOPSO, SMPSO). This way you only need to learn one library, which gives you access to a whole variety of algorithms and test problems.
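For instance, running NSGA-II in jMetalPy on a standard benchmark looks roughly like this (a sketch based on jMetalPy's documented usage; exact imports and parameter names can differ between releases, and for your 2-objective MILP you would replace ZDT1 with your own problem class):

```python
from jmetal.algorithm.multiobjective.nsgaii import NSGAII
from jmetal.operator import PolynomialMutation, SBXCrossover
from jmetal.problem import ZDT1
from jmetal.util.termination_criterion import StoppingByEvaluations

problem = ZDT1()   # continuous 2-objective benchmark with 30 decision variables

algorithm = NSGAII(
    problem=problem,
    population_size=100,
    offspring_population_size=100,
    mutation=PolynomialMutation(probability=1.0 / 30, distribution_index=20),
    crossover=SBXCrossover(probability=1.0, distribution_index=20),
    termination_criterion=StoppingByEvaluations(max_evaluations=25000),
)
algorithm.run()
front = algorithm.get_result()   # list of nondominated solutions found
```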

  • Thanks dhasson for your answer. My main goal is to use Python for optimization as I heard that it is strongly used in industry. So basically I would like to use a general Python package for optimization and (later) use multiobjective optimization approaches. So I would also like to use normal (single-objective) solvers like CPLEX for the optimization.
    – PeterBe
    Commented Aug 12, 2020 at 13:00
  • Yes of course, Python is widely used in industry and you can use it to learn and implement that stuff. Nonetheless, if you need to create a high-performance system in the future, you might want to reimplement the Python solution/prototype, or parts of it, in a low-level language like C++. This will depend on several factors.
    – dhasson
    Commented Aug 12, 2020 at 13:30
  • Thanks dhasson for your answer. But for me there is still the question whether I should use Pyomo, PuLP, pyOpt or DEAP (as you suggested). I would like to do normal optimization and multiobjective optimization. I also found jMetalPy (github.com/jMetal/jMetalPy); I know jMetal from Java. So which of those is the most widely used optimization package for Python when it comes to normal and multiobjective optimization?
    – PeterBe
    Commented Aug 12, 2020 at 16:02
  • @PeterBe, OK, that's right. If you are interested in Python, one of the best choices (I think) is Pyomo. As you have experience with the GAMS language, Pyomo's syntax is close to it. Another reason is that it can be connected to many different solvers via the NEOS interface, which covers a broad range of optimization problems. As it is Python-based, you can combine it with lots of useful packages to implement what you are trying to do.
    – A.Omidi
    Commented Aug 18, 2020 at 19:14
  • I would stay with Pyomo and jMetalPy, as that makes it simpler to add other available metaheuristics (e.g. simulated annealing) if I wanted to extend the scope in the future. Moreover, if I decide to work with other metaheuristics in the future (for another project), I'd already know a framework with a larger variety of algorithms available (compared to DEAP).
    – dhasson
    Commented Aug 21, 2020 at 14:53

If @dhasson's excellent answer is not what you're looking for, may I suggest the possibility of using the multiobjective optimization capabilities in CPLEX or Gurobi (under Python)?


CPLEX

New multiobjective optimization features in CPLEX V12.9.0

Optimization problems with multiple linear objective functions can be specified in CPLEX. To solve them, CPLEX offers a mixture of blended and lexicographic (or hierarchical) optimization.

A blended objective consists of simply the linear combination of several objectives with given weights.

A lexicographic objective supposes that an order has been given among the various objective functions. This order allows you to define a lexicographic order among solutions: a solution is lexicographically smaller than another one if, in the first objective where they differ (following the order), it is smaller. An optimal solution is then one that is lexicographically minimal (or maximal depending on the optimization sense).

CPLEX can combine both blended and lexicographic objectives in the same optimization problem.
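As a rough illustration of driving these CPLEX capabilities from Python, here is a sketch using docplex (IBM's Python modeling API). The model itself is made up, and you should check the docplex documentation for the exact set_multi_objective signature in your version (multiobjective support requires CPLEX 12.9 or later):

```python
from docplex.mp.model import Model

mdl = Model(name="two_objectives")
x = mdl.integer_var(lb=0, ub=10, name="x")
y = mdl.continuous_var(lb=0, ub=10, name="y")
mdl.add_constraint(x + 2 * y <= 14)

f1 = 2 * x + y       # first objective, e.g. cost
f2 = -(x + 3 * y)    # second objective, e.g. negated benefit

# Different priorities -> lexicographic (hierarchical) optimization;
# equal priorities -> the objectives are blended using the given weights.
mdl.set_multi_objective("min", [f1, f2], priorities=[2, 1], weights=[1, 1])

sol = mdl.solve()
print(sol)
```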


Gurobi

Gurobi: Working With Multiple Objectives

<Edited version follows. Skips examples and some other material.>

Blended Objectives A blending approach creates a single objective by taking a linear combination of your objectives. You provide a weight for each objective as an argument to setObjectiveN. Alternatively, you can use the ObjNWeight attribute, together with ObjNumber.

Hierarchical Objectives A hierarchical or lexicographic approach assigns a priority to each objective, and optimizes for the objectives in decreasing priority order. At each step, it finds the best solution for the current objective, but only from among those that would not degrade the solution quality for higher-priority objectives. You provide the priority for each objective as an argument to setObjectiveN. Alternatively, you can use the ObjNPriority attribute. Priorities are integral, not continuous. Larger values indicate higher priorities. The default priority for an objective is 0.

Multiple-Objective Degradation By default, our hierarchical approach won't allow later objectives to degrade earlier objectives, subject to the user-given ending gap conditions for the optimization problem. This behavior can be relaxed for MIPs through a pair of tolerances: a relative and an absolute tolerance. These are provided as arguments to setObjectiveN, or they can be set using attributes ObjNRelTol and ObjNAbsTol. By setting one of these for a particular objective, you can indicate that later objectives are allowed to degrade this objective by the specified relative or absolute amount, respectively.

Objective degradations are handled differently for multi-objective LP models. For LP models, solution quality for higher-priority objectives is maintained by fixing some variables to their values in previous optimal solutions. These fixings are decided using variable reduced costs. The value of the ObjNAbsTol parameter indicates the amount by which a fixed variable's reduced cost is allowed to violate dual feasibility, whereas the ObjNRelTol parameter is simply ignored. If you want the MIP behavior, where the degradation is controlled more directly, you can add a dummy binary variable to the model, thus transforming it into a MIP. Solving the resulting multi-objective MIP will be much more time consuming than solving the original multi-objective LP.

Combining Blended and Hierarchical Objectives Actually, both weight and priority are always specified for each objective. This allows you to seamlessly combine the blended and hierarchical approaches. To understand how this works, we should first provide more detail on how hierarchical objectives are handled. When you specify a different priority for each objective, the solver performs separate optimization steps. In each step, in decreasing priority order, it optimizes for the current objective multiplied by its ObjNWeight attribute, while imposing constraints that ensure that the quality of higher-priority objectives isn't degraded by more than the specified tolerances.

Multiple objective values can be queried programmatically in all our APIs. The basic notion is that you have to specify which multi-objective you want to query information for (by setting the parameter ObjNumber). Furthermore, you can also specify which solution you want to query this information for (by setting the parameter SolutionNumber).
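For concreteness, a small gurobipy sketch of these mechanics - the variables, constraint and objectives are invented for illustration only:

```python
import gurobipy as gp
from gurobipy import GRB

m = gp.Model("multiobj")
x = m.addVar(vtype=GRB.INTEGER, lb=0, ub=10, name="x")
y = m.addVar(lb=0, ub=10, name="y")
m.addConstr(x + 2 * y <= 14)

m.ModelSense = GRB.MINIMIZE
# A higher priority objective is optimized first (hierarchical approach);
# objectives sharing a priority are blended using their weights.
m.setObjectiveN(2 * x + y, index=0, priority=2, weight=1.0, name="cost")
m.setObjectiveN(-(x + 3 * y), index=1, priority=1, weight=1.0, name="benefit")
m.optimize()

# Query each objective's value by selecting it with the ObjNumber parameter.
for i in range(m.NumObj):
    m.params.ObjNumber = i
    print(f"objective {i}: {m.ObjNVal}")
```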

  • Thanks Mark for your answer. Basically I would like to have a real multiobjective problem with a multidimensional objective space. As far as I understand, using standard solvers like CPLEX or Gurobi transforms the objective space into a one-dimensional space (e.g. by using the weighted sum approach) and thus neglects the multiobjective nature of the problem. By doing so you can't obtain all points of the Pareto front.
    – PeterBe
    Commented Aug 12, 2020 at 16:48
  • Please read the full documentation available at links within the links I provided. Even in the condensed versions I showed, the documentation shows that CPLEX and Gurobi both provide options beyond the transformation you describe. Nevertheless, yes, there are formulation and solution options beyond what CPLEX and Gurobi offer in their built-in capabilities. If, after reading all the documentation, you find they don't do what you want, then you should edit your question to provide more details as to what you do or don't want. Multiobjective optimization means different things to different people. Commented Aug 12, 2020 at 18:05
  • Nice answer, Mark. I knew only about Gurobi's capability, great to know that multi-objective is also possible with CPLEX. And I agree, there's no general one-size-fits-all solution.
    – dhasson
    Commented Aug 12, 2020 at 18:45

The vOptGeneric package (https://github.com/vOptSolver/vOptGeneric.jl) of vOptSolver includes primitives for solving 2-objective IPs with the weighted sum method, the epsilon-constraint method and also the Chalmet method. You can select GLPK, CPLEX or Gurobi as the MIP solver (only one line to set it up). vOptGeneric is implemented in Julia (https://julialang.org/) and works with JuMP (an algebraic modeling language). The code is compliant with the latest versions of Julia and JuMP. I am currently updating the documentation.

Regarding MOMHs (multiobjective metaheuristics), jMetal (Java, C++, or now Python) fits your needs.


You should probably also consider Optuna, a fairly new but rapidly improving optimization framework. It was originally targeted at machine learning hyperparameter optimization, but it works for all kinds of problems and provides generic algorithms (Gaussian processes, NSGA-II, TPE estimators, etc.).
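A short sketch of what a two-objective study looks like in Optuna, using its NSGA-II sampler; the search space and the two objective expressions here are placeholders, not anything prescribed by Optuna:

```python
import optuna

def objective(trial):
    # Placeholder decision variables and objectives - swap in your own model.
    x = trial.suggest_float("x", 0.0, 10.0)
    y = trial.suggest_int("y", 0, 10)
    f1 = 2 * x + y
    f2 = (x - 5.0) ** 2 + (y - 3) ** 2
    return f1, f2

study = optuna.create_study(
    directions=["minimize", "minimize"],
    sampler=optuna.samplers.NSGAIISampler(),
)
study.optimize(objective, n_trials=200)
pareto_trials = study.best_trials   # nondominated trials found so far
```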


Just adding two more options. Pymoo is a good option: it has several algorithms plus functionality for creating your own. And pysamoo is its companion for surrogate-assisted (multiobjective) optimization.
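A minimal pymoo sketch (NSGA-II on the ZDT1 benchmark); the import paths below follow recent pymoo releases and may differ in older versions, and for the MILP in the question you would define your own Problem subclass instead of using ZDT1:

```python
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize
from pymoo.problems import get_problem

problem = get_problem("zdt1")          # continuous 2-objective benchmark
algorithm = NSGA2(pop_size=100)

res = minimize(problem, algorithm, ("n_gen", 200), seed=1, verbose=False)
print(res.F)   # objective values of the nondominated solutions found
```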

  • Thanks Enrique for your answer (I upvoted it). Can you roughly estimate which of the frameworks you mentioned, and which are mentioned by others for multi-objective optimization, has the largest community, such that I could ask questions (maybe even on Stack Overflow) if I have a problem?
    – PeterBe
    Commented Dec 7, 2022 at 17:23
  • Both are closely related. Except in the case where your objective function is a simulation, I would ask questions about the first one. Commented Dec 8, 2022 at 9:49
