19

Recently I saw the question Breakthroughs in mathematics in 2021 by Johnny Cage on MathOverflow, which inspired me to ask the following:

"As specialised operations researchers we usually miss important results outside our area of research.

So, generally speaking, what are some major breakthroughs since 2010 in the different disciplines of operations research?"

Discrete lizard
Sune

6 Answers

11

Machine Learning Under a Modern Optimization Lens by Dimitris Bertsimas and Jack Dunn is a fascinating book at the intersection of OR and ML.

The book provides an original treatment of machine learning (ML) using convex, robust, and mixed-integer optimization. This leads to solutions of central ML problems at large scale that can be found in seconds or minutes, can be certified to be optimal in minutes or hours, and outperform classical heuristic approaches in out-of-sample experiments.

Optimal classification trees: this paper formulates decision tree learning as a mixed-integer optimization (MIO) problem and solves it to provable optimality, in contrast to greedy top-down heuristics such as CART.
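
To give a flavour of the MIO approach, here is a toy sketch. It is emphatically not the formulation from the paper (which handles arbitrary depths, continuous features, regularization of tree size, and so on); it is just a depth-1 "stump" over binary features written as a small PuLP model, with the data and variable names invented for illustration.

```python
import pulp

# Toy data: 6 samples, 4 binary features, binary labels (invented for illustration).
X = [
    [1, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 0],
]
y = [1, 1, 0, 0, 1, 0]
n, p = len(X), len(X[0])

prob = pulp.LpProblem("optimal_stump", pulp.LpMinimize)

# d[j] = 1 if feature j is chosen as the (single) split feature.
d = [pulp.LpVariable(f"d_{j}", cat="Binary") for j in range(p)]
# cL, cR = class label predicted in the left (x_j = 0) and right (x_j = 1) leaf.
cL = pulp.LpVariable("cL", cat="Binary")
cR = pulp.LpVariable("cR", cat="Binary")
# z[i] >= 1 whenever sample i is misclassified; minimised, so it acts as an indicator.
z = [pulp.LpVariable(f"z_{i}", lowBound=0) for i in range(n)]

prob += pulp.lpSum(z)              # objective: number of misclassified samples
prob += pulp.lpSum(d) == 1         # exactly one split feature

for i in range(n):
    # m_i = 1 exactly when sample i falls into the right leaf under the chosen split.
    m_i = pulp.lpSum(X[i][j] * d[j] for j in range(p))
    # Mismatch between a leaf's label and y_i; linear in cL/cR because y_i is data.
    mismatch_L = (1 - cL) if y[i] == 1 else cL
    mismatch_R = (1 - cR) if y[i] == 1 else cR
    prob += z[i] >= mismatch_L - m_i         # binds when the sample goes left
    prob += z[i] >= mismatch_R - (1 - m_i)   # binds when the sample goes right

prob.solve(pulp.PULP_CBC_CMD(msg=0))
split = next(j for j in range(p) if d[j].value() > 0.5)
print("split on feature", split,
      "| left ->", int(cL.value()), "| right ->", int(cR.value()),
      "| training errors:", int(pulp.value(prob.objective)))
```

Even this tiny model shows the appeal: the solver returns a split that is provably optimal for the training data, rather than merely the best greedy choice.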

VChourasiya

  • Where is the "breakthrough" here? – Robert Bassett Jan 04 '22 at 13:27
  • I would say it is refreshing, to say the least, to apply OR techniques in a new field, especially one where almost everyone uses heuristics. Why use heuristics when you can solve the problem to optimality in reasonable time? The breakthrough would be potentially displacing an industry standard with exact algorithms while everyone else relies on heuristics. – DaPurr Jan 04 '22 at 14:04
11

Hedetniemi conjectured in 1966 that $\chi(G \times H)=\min\{\chi(G),\chi(H)\}$ for all graphs $G$ and $H$. Here $G \times H$ denotes the tensor (categorical) product: the graph with vertex set $V(G) \times V(H)$ in which $(x,y)$ and $(x',y')$ are adjacent if and only if $xx' \in E(G)$ and $yy' \in E(H)$.
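
As a small sanity check of the statement (on tiny graphs, where the conjecture does hold), one can build the product with networkx and compute chromatic numbers by brute-force backtracking; the helper functions below are my own illustration and only work for very small graphs.

```python
import networkx as nx

def is_k_colorable(G, k):
    """Backtracking search for a proper k-coloring (only suitable for tiny graphs)."""
    nodes = list(G.nodes)
    colour = {}

    def backtrack(i):
        if i == len(nodes):
            return True
        v = nodes[i]
        for c in range(k):
            if all(colour.get(u) != c for u in G[v]):
                colour[v] = c
                if backtrack(i + 1):
                    return True
                del colour[v]
        return False

    return backtrack(0)

def chromatic_number(G):
    k = 1
    while not is_k_colorable(G, k):
        k += 1
    return k

G = nx.cycle_graph(5)        # chi(C5) = 3
H = nx.complete_graph(3)     # chi(K3) = 3
P = nx.tensor_product(G, H)  # the product graph G x H defined above

print(chromatic_number(G), chromatic_number(H), chromatic_number(P))
# prints 3 3 3, matching min{chi(G), chi(H)} on this small instance
```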

It turns out this is not true: Yaroslav Shitov disproved the conjecture in 2019, 53 years after it was posed, and this is considered a breakthrough.

There is a good video by Numberphile in which the result is explained well.

Kuifje
10

For the metric TSP, Karlin et al. [1] developed a novel approximation algorithm that improves on the famous Christofides algorithm's approximation ratio of $3/2$, albeit by a margin on the order of $10^{-36}$!

This problem had been unresolved for decades, and while the improvement may look minuscule, the new techniques may pave the way for better worst-case approximation guarantees. The history of the effort, dating back to 2010, has been covered in Quanta Magazine.
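
For context, the baseline being improved is Christofides' classical algorithm (build a minimum spanning tree, add a minimum-weight perfect matching on its odd-degree vertices, take an Eulerian circuit, and shortcut it), which guarantees a tour at most $3/2$ times optimal on metric instances. The sketch below compares it against the brute-force optimum on a small random Euclidean instance; it assumes the `christofides` helper shipped with recent versions of networkx and illustrates only this classical baseline, not the new algorithm of Karlin et al.

```python
import itertools
import math
import random

import networkx as nx
from networkx.algorithms.approximation import christofides

random.seed(0)
pts = [(random.random(), random.random()) for _ in range(8)]

# Complete graph with Euclidean (hence metric) edge weights.
G = nx.Graph()
for i, j in itertools.combinations(range(len(pts)), 2):
    G.add_edge(i, j, weight=math.dist(pts[i], pts[j]))

def tour_length(order):
    return sum(G[order[k]][order[(k + 1) % len(order)]]["weight"]
               for k in range(len(order)))

# Classical Christofides tour: MST + matching on odd-degree vertices + shortcutting.
tour = christofides(G)
if tour[0] == tour[-1]:          # drop the repeated endpoint if the tour is closed
    tour = tour[:-1]
heur_len = tour_length(tour)

# Exact optimum by brute force (fine for 8 cities).
opt_len = min(tour_length((0,) + p)
              for p in itertools.permutations(range(1, len(pts))))

print(heur_len, opt_len, heur_len / opt_len)  # ratio is at most 1.5 on metric instances
```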


Reference

[1] Karlin, A. R., Klein, N., & Oveis Gharan, S. (2021). A (slightly) improved approximation algorithm for metric TSP. In Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing (STOC 2021), pp. 32–45. [Preprint published on arXiv in 2020.]

7

INFORMS recognizes the best contributions in OR/MS with the Frederick W. Lanchester Prize:

"The Lanchester prize is awarded for the best contribution to operations research and the management sciences published in English in the past five years."

The 2021 prize was awarded to D. Bertsimas and J. Dunn, as outlined in the other answer. The 2020 prize was awarded to P. Mohajerin Esfahani and D. Kuhn for their work on data-driven distributionally robust optimization (DRO) using the Wasserstein metric, which can be found here. The breakthrough in the DRO method is a major one: after years of research, the authors showed that, under mild assumptions, the DRO problem can be reformulated as a finite convex program, and in important cases even as a tractable linear program. Since then the method has been widely used to solve many problems, and as of January 2022 the paper had been cited about 800 times on Google Scholar.
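
To give a flavour of what "finite convex, and even linear" means here, the following is a commonly quoted special case of the reformulation, stated from memory with the technical conditions omitted, so treat it as an illustration rather than the precise theorem. For a piecewise-affine loss $\ell(\xi)=\max_{k\le K}(a_k^\top\xi+b_k)$, samples $\hat\xi_1,\dots,\hat\xi_N$, unrestricted support, and a type-1 Wasserstein ball $\mathbb{B}_\varepsilon(\hat P_N)$ of radius $\varepsilon$ induced by a norm $\|\cdot\|$,

$$\sup_{Q\in\mathbb{B}_\varepsilon(\hat P_N)} \mathbb{E}_Q[\ell(\xi)] \;=\; \min_{\lambda\ge 0,\ s\in\mathbb{R}^N}\Big\{\lambda\varepsilon+\tfrac{1}{N}\textstyle\sum_{i=1}^N s_i \;:\; a_k^\top\hat\xi_i+b_k\le s_i\ \ \forall i,k,\quad \|a_k\|_*\le\lambda\ \ \forall k\Big\},$$

where $\|\cdot\|_*$ is the dual norm. The key point is that the right-hand side is a linear program with one variable per sample, which is what makes the worst-case expectation computable in practice.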

A full list of the winners of the Frederick W. Lanchester prize can be found here.

Mostafa
6

Quadratization without auxiliary variables

For decades, people have been converting arbitrary optimization problems into linear ones, which makes them far easier to solve. With so much recent progress in solving quadratic optimization problems, it has also become customary to try to convert problems into quadratic ones, especially when it is not practical to convert them all the way to linear. In 1975, Rosenberg [1] gave a method to quadratize unconstrained binary optimization problems by introducing auxiliary (or "slack") variables. Unfortunately, quadratization with auxiliary variables often blows up the number of variables so much that it defeats the purpose of quadratizing in the first place; indeed, I have still not found a satisfying answer to this question: are there any real-world problems where quadratization helps to solve something that couldn't have been solved without it?
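
To make the auxiliary-variable idea concrete, here is a minimal sketch in the spirit of Rosenberg's substitution [1] (my own toy illustration, not a method from the book, and certainly not one of the newer auxiliary-free methods): a single cubic monomial is reduced to quadratic form by introducing one auxiliary binary variable $y$ that is meant to equal $x_1 x_2$, with a penalty term that vanishes exactly when it does.

```python
import itertools

# Cubic pseudo-Boolean objective to minimise: f(x) = -x1*x2*x3 (minimum -1 at x = (1,1,1)).
def f(x1, x2, x3):
    return -x1 * x2 * x3

# Auxiliary-variable quadratization: introduce a binary y intended to equal x1*x2,
# and add a penalty that is 0 iff y == x1*x2 and at least M otherwise.
M = 2  # must exceed the largest gain obtainable by letting y "cheat" (here that gain is 1)

def g(x1, x2, x3, y):
    penalty = x1 * x2 - 2 * x1 * y - 2 * x2 * y + 3 * y
    return -y * x3 + M * penalty   # quadratic in (x1, x2, x3, y)

# Minimising over y reproduces f pointwise, so both functions share their minimisers.
for x in itertools.product((0, 1), repeat=3):
    assert min(g(*x, y) for y in (0, 1)) == f(*x)

best_f = min(f(*x) for x in itertools.product((0, 1), repeat=3))
best_g = min(g(*x, y) for x in itertools.product((0, 1), repeat=3) for y in (0, 1))
print(best_f, best_g)   # both -1
```

On real problems this has to be done for many monomials at once, and it is exactly this proliferation of auxiliary variables that the question above complains about.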

What happened in the last 10 years?

The number of new ways to quadratize optimization problems has exploded since the 1975 Rosenberg paper, and as I wrote in my 2019 book on quadratization (emphasis on the last part of the sentence added here), "Part I gives more than 40 different ways to do this, almost all of them published in the last 5 years." In fact, only two of the 40+ methods were published before 2011, largely because of the explosion of interest in using graph-cut algorithms to solve computer vision problems in the preceding decade, 2000-2010 (see this related OR.SE question). However, every quadratization method until 2014 still needed auxiliary variables (newer methods simply used fewer auxiliary variables or improved the quadratization in some other way).

What happened since 2014?

Methods to quadratize arbitrary binary optimization problems without auxiliary variables. At least five such methods appeared between 2014 and 2018. In my book they are presented in the order in which I would recommend trying them, and are described in the subsections of Chapter 1.1.

References:

  • [1] I. G. Rosenberg, “Reduction of Bivalent Maximization to the Quadratic Case,” Cahiers du Centre d’Etudes de Recherche Operationnelle 17, 71–74 (1975).
Nike Dattani
4

"A counterexample to Hirsch Conjecture" by Francisco Santos.

Warren Hirsch conjectured in 1957 that the graph of a $d$-dimensional polytope with $n$ facets cannot have diameter greater than $n-d$. That is, any two vertices of the polytope can be connected by a path of at most $n-d$ edges. The conjecture would have had strong implications, since the diameter of the polytope is a lower bound on the number of steps the simplex method needs in the worst case, whereas Klee and Minty had already shown with their Klee-Minty cubes (1972) that the simplex method with Dantzig's pivot rule can require exponentially many steps.
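
As a tiny illustration of the conjectured bound itself (not of the counterexample, which lives in much higher dimension): the 3-dimensional cube has $d=3$ and $n=6$ facets, and the diameter of its graph meets the bound $n-d=3$ exactly. A quick check with networkx:

```python
import networkx as nx

# The 3-dimensional cube: d = 3 and n = 6 facets, so the Hirsch bound is n - d = 3.
d, n = 3, 6
Q3 = nx.hypercube_graph(3)            # vertex-edge graph of the 3-cube
print(nx.diameter(Q3), "<=", n - d)   # prints "3 <= 3": the bound is tight here
```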

Link to paper: https://annals.math.princeton.edu/2012/176-1/p07

most optimum

  • I spent a couple of days during a grad school vacation > 40 years ago trying to prove or produce a counterexample to the Hirsch conjecture. If successful, that would have made me a big cheese. I was not successful. – Mark L. Stone Jan 12 '22 at 23:45