Supported Regression models

The package currently supports various scikit-learn objects. It also supports gradient boosting regression from XGBoost and has limited support for Keras and PyTorch. Only sequential neural networks with ReLU activation functions are currently supported. In Mixed Integer Formulations, we briefly outline the formulations used for the various regression models.

The versions of these packages tested with the current release (1.5.3) are listed in the table Supported packages with version 1.5.3.

Scikit-learn

The following table lists the supported models, the name of the corresponding object in the Python framework, and the function that can be used to insert it into a Gurobi model; a short usage sketch follows the table.

Transformers in scikit-learn

Transformer            | Scikit-learn object | Function to insert
---------------------- | ------------------- | ------------------------------
StandardScaler         | StandardScaler      | add_standard_scaler_constr
Pipeline               | Pipeline            | add_pipeline_constr
PolynomialFeatures [3] | PolynomialFeatures  | add_polynomial_features_constr
ColumnTransformer      | ColumnTransformer   | add_column_transformer_constr
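
As an illustration, here is a minimal sketch that embeds a small scikit-learn pipeline in a Gurobi model. The training data, variable bounds, and names are illustrative only, and the generic add_predictor_constr entry point is used (it dispatches to the functions listed above; the pipeline-specific import path shown in the comment is an assumption).

```python
# Minimal sketch: embed a scikit-learn Pipeline (StandardScaler + LinearRegression)
# in a Gurobi model. Data, bounds, and variable names are illustrative only.
import numpy as np
import gurobipy as gp
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LinearRegression
from gurobi_ml import add_predictor_constr  # generic entry point
# from gurobi_ml.sklearn import add_pipeline_constr  # pipeline-specific alternative (path assumed)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 3)), rng.normal(size=100)
pipe = make_pipeline(StandardScaler(), LinearRegression()).fit(X, y)

m = gp.Model()
x_vars = m.addMVar(3, lb=-2.0, ub=2.0, name="x")      # decision variables fed to the pipeline
y_var = m.addMVar(1, lb=-gp.GRB.INFINITY, name="y")   # variable tied to the pipeline's prediction
pred_constr = add_predictor_constr(m, pipe, x_vars, y_var)
pred_constr.print_stats()  # summary of the variables and constraints added for the pipeline
```

After the call, y_var is linked to the pipeline's prediction for x_vars inside the optimization model, so it can appear in the objective or in further constraints.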

Keras

Keras neural networks can be built using the functional API, model subclassing, or the Sequential class.

They can be formulated in a Gurobi model with the function add_keras_constr.

Currently, only two types of layers are supported:

  • Dense layers (possibly with a ReLU activation),

  • ReLU layers.
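
The sketch below shows the idea, assuming TensorFlow's bundled Keras, a tiny synthetic training set, arbitrary variable bounds, and the generic add_predictor_constr entry point (add_keras_constr can be used the same way; the import path shown in the comment is an assumption).

```python
# Minimal sketch: embed a small Keras Sequential network (Dense + ReLU only)
# in a Gurobi model. Data, architecture, and bounds are illustrative only.
import numpy as np
import gurobipy as gp
from tensorflow import keras
from gurobi_ml import add_predictor_constr  # generic entry point
# from gurobi_ml.keras import add_keras_constr  # Keras-specific alternative (path assumed)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 4)), rng.normal(size=(200, 1))
nn = keras.Sequential(
    [
        keras.Input(shape=(4,)),
        keras.layers.Dense(16, activation="relu"),
        keras.layers.Dense(1),
    ]
)
nn.compile(optimizer="adam", loss="mse")
nn.fit(X, y, epochs=2, verbose=0)

m = gp.Model()
x_vars = m.addMVar((1, 4), lb=-1.0, ub=1.0, name="x")
y_var = m.addMVar((1, 1), lb=-gp.GRB.INFINITY, name="y")
pred_constr = add_predictor_constr(m, nn, x_vars, y_var)
```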

PyTorch

In PyTorch, only torch.nn.Sequential objects are supported.

They can be formulated in a Gurobi model with the function add_sequential_constr.

Currently, only two types of layers are supported:

  • Linear layers (torch.nn.Linear),

  • ReLU layers (torch.nn.ReLU).
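
A minimal sketch follows, assuming a torch.nn.Sequential network built only from Linear and ReLU layers (left untrained here for brevity) and the generic add_predictor_constr entry point (add_sequential_constr works the same way; the import path in the comment is an assumption).

```python
# Minimal sketch: embed a torch.nn.Sequential network (Linear + ReLU only)
# in a Gurobi model. Architecture and bounds are illustrative only.
import gurobipy as gp
import torch
from gurobi_ml import add_predictor_constr  # generic entry point
# from gurobi_ml.torch import add_sequential_constr  # PyTorch-specific alternative (path assumed)

# A small sequential network; training is omitted, the weights are read as-is.
nn = torch.nn.Sequential(
    torch.nn.Linear(4, 16),
    torch.nn.ReLU(),
    torch.nn.Linear(16, 1),
)

m = gp.Model()
x_vars = m.addMVar(4, lb=-1.0, ub=1.0, name="x")
y_var = m.addMVar(1, lb=-gp.GRB.INFINITY, name="y")
pred_constr = add_predictor_constr(m, nn, x_vars, y_var)
```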

XGBoost

XGBoost’s xgboost.Booster can be formulated in a Gurobi model with the function add_xgboost_regressor_constr. The scikit-learn wrapper xgboost.XGBRegressor can be formulated using add_xgbregressor_constr.

Currently only “gbtree” boosters are supported.
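
A minimal sketch with the scikit-learn wrapper, assuming synthetic training data, arbitrary bounds, and the generic add_predictor_constr entry point (add_xgbregressor_constr can be used equivalently; the import path in the comment is an assumption).

```python
# Minimal sketch: embed an XGBRegressor (gbtree booster) in a Gurobi model.
import numpy as np
import gurobipy as gp
import xgboost as xgb
from gurobi_ml import add_predictor_constr  # generic entry point
# from gurobi_ml.xgboost import add_xgbregressor_constr  # XGBoost-specific alternative (path assumed)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 3)), rng.normal(size=200)
reg = xgb.XGBRegressor(booster="gbtree", n_estimators=20, max_depth=3).fit(X, y)

m = gp.Model()
x_vars = m.addMVar(3, lb=-1.0, ub=1.0, name="x")
y_var = m.addMVar(1, lb=-gp.GRB.INFINITY, name="y")
pred_constr = add_predictor_constr(m, reg, x_vars, y_var)
```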

LightGBM

LightGBM’s lightgbm.Booster can be formulated in a Gurobi model with the function add_lgbm_booster_constr. The scikit-learn wrapper lightgbm.sklearn.LGBMRegressor can be formulated using add_lgbmregressor_constr.
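
The usage mirrors the XGBoost case; a short sketch with the scikit-learn wrapper, under the same assumptions (synthetic data, arbitrary bounds, generic entry point, assumed import path in the comment):

```python
# Minimal sketch: embed an LGBMRegressor in a Gurobi model.
import numpy as np
import gurobipy as gp
from lightgbm import LGBMRegressor
from gurobi_ml import add_predictor_constr  # generic entry point
# from gurobi_ml.lightgbm import add_lgbmregressor_constr  # LightGBM-specific alternative (path assumed)

rng = np.random.default_rng(0)
X, y = rng.normal(size=(200, 3)), rng.normal(size=200)
reg = LGBMRegressor(n_estimators=20, max_depth=3).fit(X, y)

m = gp.Model()
x_vars = m.addMVar(3, lb=-1.0, ub=1.0, name="x")
y_var = m.addMVar(1, lb=-gp.GRB.INFINITY, name="y")
pred_constr = add_predictor_constr(m, reg, x_vars, y_var)
```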

Footnotes