Usage Example
On this page, we give a simple example of using the Gurobi Machine Learning package.
The example is entirely abstract; its aim is only to illustrate the basic functionality of the package in the simplest way possible. For more realistic applications, please refer to the notebooks in the examples section.
Before proceeding to the example itself, we need to import a number of packages. Here, we will use Scikit-learn to train regression models. We generate random data for the regression using the make_regression function. For the regression model, we use a multi-layer perceptron regressor neural network. We import the corresponding objects.
import gurobipy as gp
import numpy as np
from sklearn.datasets import make_regression
from sklearn.metrics import mean_squared_error
from sklearn.neural_network import MLPRegressor
from gurobi_ml import add_predictor_constr
We need gurobipy to build an optimization model and, from the gurobi_ml package, the add_predictor_constr function. We also need numpy.
We start by building artificial data to train our regressions. To do so, we use make_regression to obtain data with 10 features.
X, y = make_regression(n_features=10, noise=1.0)
Now, create the MLPRegressor object and fit it.
nn = MLPRegressor([20] * 2, max_iter=10000, random_state=1)
nn.fit(X, y)
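Since mean_squared_error was imported above, we can quickly check the quality of the fit on the training data (a small sanity check, not required for the rest of the example):

# Report the training error of the fitted network
print(f"Training MSE: {mean_squared_error(y, nn.predict(X)):.4f}")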
We now turn to the optimization model. In the spirit of adversarial machine learning examples, we use the regression on some training examples. We pick \(n\) training examples randomly. For each of the examples, we want to find an input in a small neighborhood of it for which the regression's predicted output is as close to \(0\) as possible.
Denoting by \(X^E\) our set of examples and by \(g\) the prediction function of our regression model, our optimization problem reads:

\[\begin{aligned} \min \quad & \sum_{i=1}^n y_i^2 \\ \text{s.t.} \quad & y = g(X), \\ & X^E - \delta \le X \le X^E + \delta, \end{aligned}\]

where \(X\) is a matrix of variables of dimension \(n \times 10\) (the number of examples we consider and the number of features in the regression, respectively), \(y\) is a vector of free (unbounded) variables, and \(\delta\) is a small positive constant.
First, let’s randomly pick 2 training examples using numpy, and create our gurobipy model.
n = 2
index = np.random.choice(X.shape[0], n, replace=False)
X_examples = X[index, :]
y_examples = y[index]
m = gp.Model()
Our only decision variables in this case are the inputs and outputs of the regression for the 2 examples. We use gurobipy.MVar matrix variables, which are the most convenient here.
The input variables have the same shape as X_examples. Their lower bound is X_examples - delta and their upper bound X_examples + delta; here we take delta = 0.2.
The output variables have the shape of y_examples and are unbounded. By default, Gurobi variables are non-negative, so we need to set an infinite lower bound.
input_vars = m.addMVar(X_examples.shape, lb=X_examples - 0.2, ub=X_examples + 0.2)
output_vars = m.addMVar(y_examples.shape, lb=-gp.GRB.INFINITY)
The constraints linking input_vars and output_vars can now be added with the function add_predictor_constr.
Note that, because of the shape of the variables, this adds 2 different constraints, one per example.
The function returns an instance of a modeling object that we can use later on.
pred_constr = add_predictor_constr(m, nn, input_vars, output_vars)
The method print_stats of the modeling object outputs the details of the regression model that was added to the Gurobi model:
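pred_constr.print_stats()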
Model for mlpregressor:
160 variables
82 constraints
80 general constraints
Input has shape (2, 10)
Output has shape (2, 1)
--------------------------------------------------------------------------------
Layer Output Shape Variables Constraints
Linear Quadratic General
================================================================================
dense (2, 20) 80 40 0 40 (relu)
dense0 (2, 20) 80 40 0 40 (relu)
dense1 (2, 1) 0 2 0 0
--------------------------------------------------------------------------------
To finish the model, we set the objective (minimizing the sum of the squared outputs), and then we can optimize it.
m.setObjective(output_vars @ output_vars, gp.GRB.MINIMIZE)
m.optimize()
Gurobi Optimizer version 11.0.3 build v11.0.3rc0 (linux64 - "Ubuntu 20.04.6 LTS")
CPU model: Intel(R) Xeon(R) Platinum 8375C CPU @ 2.90GHz, instruction set [SSE2|AVX|AVX2|AVX512]
Thread count: 1 physical cores, 2 logical processors, using up to 2 threads
Optimize a model with 82 rows, 182 columns and 1322 nonzeros
Model fingerprint: 0xede59bae
Model has 2 quadratic objective terms
Model has 80 general constraints
Variable types: 182 continuous, 0 integer (0 binary)
Coefficient statistics:
Matrix range [2e-03, 2e+00]
Objective range [0e+00, 0e+00]
QObjective range [2e+00, 2e+00]
Bounds range [1e-02, 2e+00]
RHS range [1e-02, 1e+00]
Presolve added 5 rows and 0 columns
Presolve removed 0 rows and 86 columns
Presolve time: 0.01s
Presolved: 87 rows, 96 columns, 630 nonzeros
Presolved model has 1 quadratic objective terms
Variable types: 69 continuous, 27 integer (27 binary)
Root relaxation: objective 0.000000e+00, 112 iterations, 0.00 seconds (0.00 work units)
Nodes | Current Node | Objective Bounds | Work
Expl Unexpl | Obj Depth IntInf | Incumbent BestBd Gap | It/Node Time
0 0 0.00000 0 13 - 0.00000 - - 0s
H 0 0 2777.7788946 0.00000 100% - 0s
0 0 0.00000 0 13 2777.77889 0.00000 100% - 0s
0 0 0.00000 0 12 2777.77889 0.00000 100% - 0s
0 0 0.00000 0 14 2777.77889 0.00000 100% - 0s
0 0 0.00000 0 12 2777.77889 0.00000 100% - 0s
0 0 0.00000 0 15 2777.77889 0.00000 100% - 0s
H 0 0 2502.1618383 0.00000 100% - 0s
0 2 0.00000 0 15 2502.16184 0.00000 100% - 0s
H 3 2 0.0000000 0.00000 0.00% 8.7 0s
Cutting planes:
Gomory: 1
Cover: 1
Clique: 1
MIR: 14
Flow cover: 3
Relax-and-lift: 3
Explored 4 nodes (262 simplex iterations) in 0.04 seconds (0.03 work units)
Thread count was 2 (of 2 available processors)
Solution count 3: 0 2502.16 2777.78
Optimal solution found (tolerance 1.00e-04)
Best objective 0.000000000000e+00, best bound 0.000000000000e+00, gap 0.0000%
The method get_error is useful to check that the solution computed by Gurobi is correct with respect to the regression model we use.
Let \((\bar X, \bar y)\) be the values of the input and output variables in the computed solution. The function returns \(g(\bar X) - \bar y\) using the original regression object.
Normally, all values should be small and below Gurobi’s tolerances in this example:
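pred_constr.get_error()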
array([[7.82707232e-15],
[9.93649607e-15]])
We can look at the computed values for the output variables and compare them with the original target values.
print("Computed values")
print(pred_constr.output_values.flatten())
Computed values
[0. 0.]
print("Original values")
print(y_examples)
Original values
[ 58.78959419 -28.74501691]
Finally, we can remove pred_constr with the method remove():
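pred_constr.remove()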
Total running time of the script: (0 minutes 2.027 seconds)