Supported Regression models¶
The package currently supports various scikit-learn objects. It also supports gradient boosting regression from XGBoost and has limited support for Keras and PyTorch: only sequential neural networks with the ReLU activation function are currently supported. In Mixed Integer Formulations, we briefly outline the formulations used for the various regression models.
The versions of those packages tested with the current version (1.5.3) are listed in the table Supported packages with version 1.5.3.
Scikit-learn¶
The following table lists the names of the supported models, the name of the corresponding object in the Python framework, and the function that can be used to insert it in a Gurobi model.
| Regression Model | Scikit-learn object | Function to insert |
|---|---|---|
| Ordinary Least Square | LinearRegression | add_linear_regression_constr |
| Partial Least Square | PLSRegression | add_pls_regression_constr |
| Logistic Regression [1] | LogisticRegression | add_logistic_regression_constr |
| Neural network [2] | MLPRegressor | add_mlp_regressor_constr |
| Decision tree | DecisionTreeRegressor | add_decision_tree_regressor_constr |
| Gradient boosting | GradientBoostingRegressor | add_gradient_boosting_regressor_constr |
| Random Forest | RandomForestRegressor | add_random_forest_regressor_constr |
Keras¶
Keras neural networks can be built using the functional API, by subclassing Model, or with the Sequential class.
They can be formulated in a Gurobi model with the function add_keras_constr.
Currently, only two types of layers are supported:

- Dense layers (possibly with relu activation),
- ReLU layers with default settings.
PyTorch¶
In PyTorch, only torch.nn.Sequential objects are supported.
They can be formulated in a Gurobi model with the function add_sequential_constr.
Currently, only two types of layers are supported:

- torch.nn.Linear layers,
- torch.nn.ReLU layers.
XGBoost¶
XGBoost’s xgboost.Booster can be formulated in a Gurobi model with the function add_xgboost_regressor_constr.

The scikit-learn wrapper xgboost.XGBRegressor can be formulated using add_xgbregressor_constr.

Currently, only “gbtree” boosters are supported.
LightGBM¶
LightGBM’s lightgbm.Booster can be formulated in a Gurobi model with the function add_lgbm_booster_constr.

The scikit-learn wrapper lightgbm.sklearn.LGBMRegressor can be formulated using add_lgbmregressor_constr.
Footnotes