Bibliography
R. Anderson, J. Huchette, W. Ma, C. Tjandraatmadja, and J. P. Vielma. Strong mixed-integer programming formulations for trained neural networks. Mathematical Programming, 183:3–39, 2020.
David Bergman, Teng Huang, Philip Brooks, Andrea Lodi, and Arvind U. Raghunathan. JANOS: an integrated predictive and prescriptive modeling framework. INFORMS Journal on Computing, 34(2):807–816, 2022. arXiv:1911.09461, doi:10.1287/ijoc.2020.1023.
F. Ceccon, J. Jalving, J. Haddad, A. Thebelt, C. Tsay, C. D. Laird, and R. Misener. OMLT: optimization & machine learning toolkit. 2022. arXiv:2202.02414.
François Chollet and others. Keras. https://keras.io, 2015.
Matteo Fischetti and Jason Jo. Deep neural networks and mixed integer linear optimization. Constraints, 23(2):296–309, 2018. doi:10.1007/s10601-018-9285-6.
Bjarne Grimstad and Henrik Andersson. ReLU networks as surrogate models in mixed-integer linear programs. Computers & Chemical Engineering, 131:106580, 2019. URL: https://www.sciencedirect.com/science/article/pii/S0098135419307203, doi:10.1016/j.compchemeng.2019.106580.
Carlos A. Henao and Christos T. Maravelias. Surrogate-based superstructure optimization framework. AIChE Journal, 57(5):1216–1232, 2011. doi:10.1002/aic.12341.
Jan Kronqvist, Ruth Misener, and Calvin Tsay. Between steps: intermediate relaxations between big-M and convex hull formulations. In Integration of Constraint Programming, Artificial Intelligence, and Operations Research: 18th International Conference, CPAIOR 2021, Vienna, Austria, July 5–8, 2021, Proceedings, 299–314. Berlin, Heidelberg, 2021. Springer-Verlag. doi:10.1007/978-3-030-78230-6_19.
Zhou Lu, Hongming Pu, Feicheng Wang, Zhiqiang Hu, and Liwei Wang. The expressive power of neural networks: a view from the width. In I. Guyon, U. Von Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems, volume 30. Curran Associates, Inc., 2017. URL: https://proceedings.neurips.cc/paper/2017/file/32cbf687880eb1674a07bf717761dd3a-Paper.pdf.
Laurens Lueg, Bjarne Grimstad, Alexander Mitsos, and Artur M. Schweidtmann. reluMIP: open source tool for MILP optimization of ReLU neural networks. 2021. URL: https://github.com/ChemEngAI/ReLU_ANN_MILP, doi:10.5281/zenodo.5601907.
Donato Maragno and Holly Wiberg. OptiCL: mixed-integer optimization with constraint learning. 2021. URL: https://github.com/hwiberg/OptiCL/.
Donato Maragno, Holly Wiberg, Dimitris Bertsimas, S. Ilker Birbil, Dick den Hertog, and Adejuyigbe Fajemisin. Mixed-integer optimization with constraint learning. 2021. URL: https://arxiv.org/abs/2111.04469, doi:10.48550/arXiv.2111.04469.
F. Pedregosa, G. Varoquaux, A. Gramfort, V. Michel, B. Thirion, O. Grisel, M. Blondel, P. Prettenhofer, R. Weiss, V. Dubourg, J. Vanderplas, A. Passos, D. Cournapeau, M. Brucher, M. Perrot, and E. Duchesnay. Scikit-learn: machine learning in Python. Journal of Machine Learning Research, 12:2825–2830, 2011.
Artur M. Schweidtmann, Dominik Bongartz, and Alexander Mitsos. Optimization with trained machine learning models embedded. 2022. URL: https://arxiv.org/abs/2207.12722, doi:10.48550/arXiv.2207.12722.
C. Tjandraatmadja, R. Anderson, J. Huchette, W. Ma, K. Patel, and J. P. Vielma. The convex relaxation barrier, revisited: tightened single-neuron relaxations for neural network verification. In H. Larochelle, M. Ranzato, R. Hadsell, M. F. Balcan, and H. Lin, editors, Advances in Neural Information Processing Systems, volume 33, 21675–21686. Curran Associates, Inc., 2020.