Journée Axe PMNL
Practical information
- Registration is free but mandatory; sign up here: https://framadate.org/EhHCgHPvAIBXvuwa
- Venue: Laboratoire d'informatique de l'X (LIX), Campus de l'École Polytechnique, 1, rue Honoré d'Estienne d'Orves (Bâtiment Alan Turing), 91120 Palaiseau
- Room: TBA
- How to get there: https://www.lix.polytechnique.fr/comment-se-rendre-au-lix/
Invited talks
-
Renaud Chicoisne (U. Clermont-Ferrand, France)
Title: On partial convexifications of quadratically constrained sets containing indicator variables
Abstract: We study a set S defined by the epigraph of a strictly convex quadratic function with indicators on the continuous variables.
More formally, a (2n+1)-dimensional tuple (x,z,t) belongs to S if
- t >= x'Qx with Q positive definite (PD), and
- each of the n binary components of z indicates whether the corresponding component of x is nonzero (on-off constraints).
For any subset of {1,…,n} of cardinality q, we derive a closed-form expression for the convex hull of S when keeping only q of the on-off constraints. We show that this reduces to extracting a certain Schur complement from the PD matrix Q and then convexifying a lower-dimensional set S'.
We show that this smaller convex hull can be rewritten in an extended space with a single SDP constraint of size (q+1)x(q+1).
When q=n, we recover a recent result of Wei et al., Math. Prog. 204(1), 703-737 (2024). In their work, they also show that the optimization over S reduces to a *compact* mixed integer *linear* optimization problem; we show that our approach can generate strong cuts for that linear reformulation.
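For readers who prefer a compact formulation, the set described in the abstract can be sketched as follows (our notation, not the speaker's; the complementarity form x_i(1-z_i)=0 is one standard way to encode the on-off constraints):

```latex
S = \Bigl\{ (x,z,t) \in \mathbb{R}^n \times \{0,1\}^n \times \mathbb{R} \;:\;
      t \ge x^\top Q x,\; \; x_i (1 - z_i) = 0 \;\; \forall i \in \{1,\dots,n\} \Bigr\},
\qquad Q \succ 0 .
```

Here z_i = 0 forces x_i = 0, so z_i acts as the indicator of the support of x.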
-
Maïssa Daadaa (École Polytechnique, France)
-
Florian Fontan (Artelys, France)
-
Bissan Ghaddar (Ivey Business School, Canada)
Title: Learning for Nonlinear Optimization Problems
Abstract: The use of machine learning techniques to improve the performance of branch-and-bound optimization algorithms is a very active area in the context of mixed integer linear problems, but little has been done for non-linear optimization. To bridge this gap, we develop a learning framework for spatial branching and show its efficacy in the context of the Reformulation-Linearization Technique for polynomial optimization problems. The proposed learning is performed offline, based on instance-specific features and with no computational overhead when solving new instances. Novel graph-based features are introduced, which turn out to play an important role in the learning. Experiments on different benchmark instances from MINLPLib and QPLIB show that the learning-based branching selection and learning-based constraint generation significantly outperform the standard approaches.
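As background (not part of the abstract): the Reformulation-Linearization Technique mentioned above builds a linear relaxation of a polynomial problem by multiplying pairs of bound factors and replacing each product of variables with a new linear variable. For a degree-2 term, one such RLT constraint reads:

```latex
(x_i - l_i)(u_j - x_j) \ge 0
\quad\Longrightarrow\quad
u_j x_i - X_{ij} - l_i u_j + l_i x_j \ge 0,
\qquad X_{ij} := x_i x_j ,
```

where l and u are the variable bounds. Spatial branching then splits the domain of a continuous variable x_i, tightening these bound factors on each child node; the talk's learning framework addresses how to select the branching variable.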
-
Andrea Simonetto (ENSTA, France)
Title: Quantum computing for optimization: a tour d'horizon of variational algorithms, with a look at quadratic assignment problems and beyond
Abstract: Quantum computing is a new computing paradigm that has been advocated as transformative for hard combinatorial optimization. The real picture is, however, more nuanced, with some quantum-based algorithms offering new heuristics for well-established problems. In this talk, I will give a tour d'horizon of variational algorithms on quantum computers for combinatorial problems. I will point out challenges, opportunities, and the hope we have of making quantum optimization useful. In particular, I will present some initial results on quadratic assignment problems and permutation-based problems in general.
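As background (our illustration, not necessarily the algorithms covered in the talk): a canonical variational algorithm for combinatorial problems is QAOA, which prepares a parameterized state by alternating a cost Hamiltonian H_C (encoding the objective) and a mixer H_M:

```latex
|\psi(\boldsymbol{\gamma}, \boldsymbol{\beta})\rangle
  \;=\; \prod_{k=1}^{p} e^{-i \beta_k H_M}\, e^{-i \gamma_k H_C}\, |+\rangle^{\otimes n},
```

and a classical outer loop tunes the angles (γ, β) to minimize the measured energy ⟨ψ|H_C|ψ⟩, from which candidate solutions are sampled.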