ReLU
- class gurobi_ml.modeling.neuralnet.activations.ReLU
Bases: object
Class to apply the ReLU activation on a neural network layer.
- Parameters:
setbounds (Bool) – Optional flag; set to False to skip setting bounds on the output variables.
bigm (Float) – Optional maximal value used for the big-M bounds in the formulation.
- setbounds
Optional flag; set to False to skip setting bounds on the output variables.
- Type:
Bool
- bigm
Optional maximal value used for the big-M bounds in the formulation.
- Type:
Float
Methods

mip_model(layer) – MIP model for ReLU activation on a layer.
- mip_model(layer)
MIP model for ReLU activation on a layer.
- Parameters:
layer (AbstractNNLayer) – Layer to which activation is applied.
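The `bigm` attribute suggests the standard big-M mixed-integer encoding of y = max(0, x), where a binary indicator selects the active piece of the ReLU. As a minimal sketch of that encoding in plain Python (not the gurobi_ml API; `relu_bigm_feasible` is a hypothetical helper for checking the constraints, assuming |x| <= big_m):

```python
def relu_bigm_feasible(x, y, z, big_m):
    """Check whether (x, y, z) satisfies the big-M ReLU constraints.

    Standard MIP encoding of y = max(0, x) with binary indicator z
    (hypothetical illustration; not part of gurobi_ml):
        y >= 0
        y >= x
        y <= big_m * z            # z = 0 forces y = 0
        y <= x + big_m * (1 - z)  # z = 1 forces y <= x, hence y = x
    """
    return y >= 0 and y >= x and y <= big_m * z and y <= x + big_m * (1 - z)


def relu(x):
    """Reference ReLU value."""
    return max(0.0, x)
```

For any input with |x| <= big_m, the point y = relu(x) is feasible for at least one choice of z, while values of y that differ from relu(x) are cut off; this is how the MIP model links a layer's input and output variables.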