
Given a general non-linear problem:

\begin{align}P:\qquad&\min_{x\in X} f(x)\\\text{s.t.}\qquad&g(x)\leq 0\end{align}

where $f$ is a non-linear function, $g$ is a vector of non-linear constraints, and $X$ is a box-constrained domain, is there a way to formulate an optimisation problem to prove whether $P$ is convex or not?

This is typically handled through heuristic procedures (symbolic compositions of functions, eigenvalue arithmetic, Dr. AMPL, etc.) but I'm curious about how to formulate an optimisation problem to definitively prove/disprove the convexity of a problem regardless of computing cost.

We know from the theory that proving/disproving convexity of an NLP is an NP-hard (global optimisation) problem in the general case, but I can't find a formulation. Does one exist? Are you aware of a reference describing the formulation?

Nikos Kazazakis
  • If h(x) is nonlinear, then unless the feasible set is a singleton, such as $x^2 = 0$, the odds of the problem being convex are not good, to put it mildly. – Mark L. Stone Sep 03 '19 at 17:24
  • @MarkL.Stone You're right, I updated the question to restrict the scope to linear equality constraints. – Nikos Kazazakis Sep 03 '19 at 17:28
  • Now that you have edited the problem, the box constraints and (general) linear equality (as well as inequality) constraints, are convex, and therefore can be removed from the formulation without affecting the problem's convexity. – Mark L. Stone Sep 03 '19 at 17:30
  • Also true, I removed the equality constraints altogether. – Nikos Kazazakis Sep 03 '19 at 17:44
  • The problem is equivalent to deciding whether $\{x : f(x) \leq \theta, g(x) \leq 0\}$ is convex. One way to disprove convexity is to find $x_1, x_2, \lambda$ such that $x_1, x_2$ are in the set but $\lambda x_1 + (1-\lambda)x_2$ is not. Therefore we can consider an optimization problem where we maximize the infeasibility of $\lambda x_1+(1-\lambda)x_2$ subject to the feasibility of $x_1, x_2$. The objective is bilinear, but you could bisection-search over $\lambda$. Of course, you need to be able to optimize over $g(x)$ for this formulation to work, so I'll keep this as a comment. – Ryan Cory-Wright Sep 03 '19 at 19:14
  • Since working with general convex functions may not be that tractable, something else to consider is verifying whether the objective function and domain are Sum-of-Squares-convex, particularly if $f(x)$ and $g(x)$ are polynomials. Roughly speaking, the set of SOS-convex polynomials is a strict inner approximation of the set of convex polynomials, which can be optimized over using sum-of-squares/SDP; see here for a characterization of the difference between SOS-convexity and convexity. – Ryan Cory-Wright Sep 04 '19 at 04:12

1 Answer


Based on the comment by Ryan Cory-Wright, you could formulate it like this.

Verify convexity of the domain $\{x \in X : g(x) \le 0\}$

Solve the following problem, and check the optimal value.

\begin{align} \max\qquad& \max_i\; g_i\left(\lambda x + (1-\lambda) y\right) && \small\textrm{(maximize the largest constraint violation of the convex combination)}\\ \text{s.t.}\qquad& g(x) \le 0 && \textrm{($x$ is feasible)}\\ & g(y) \le 0 && \textrm{($y$ is feasible)}\\ & 0 \le \lambda \le 1 && \textrm{(define convex combination)}\\ & x, y \in X && \textrm{(definitions of $x$ and $y$)}\\ \end{align}

If the optimal value is (strictly) positive, then the set is not convex: $x$ and $y$ are both in the set, while their convex combination $\lambda x + (1-\lambda) y$ is not. If the global optimal value is non-positive, then the set is convex.
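As an illustration (my own sketch, not part of the original answer), this check can be run with SciPy's local SLSQP solver on a made-up non-convex set: the exterior of the unit ball inside a box. Note that a local solver can only *disprove* convexity, by finding a strictly positive violation; certifying convexity would require solving the problem to global optimality.

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint, Bounds

n = 2  # dimension

# Made-up example: g(x) = 1 - ||x||^2 <= 0, i.e. the (non-convex)
# exterior of the unit ball, intersected with the box X = [-2, 2]^n.
def g(x):
    return 1.0 - np.dot(x, x)

# Decision vector z = (x, y, lambda); the objective is the constraint
# violation of the convex combination lambda*x + (1-lambda)*y.
def violation(z):
    x, y, lam = z[:n], z[n:2 * n], z[-1]
    return g(lam * x + (1 - lam) * y)

cons = [
    NonlinearConstraint(lambda z: g(z[:n]), -np.inf, 0.0),       # x is feasible
    NonlinearConstraint(lambda z: g(z[n:2 * n]), -np.inf, 0.0),  # y is feasible
]
bounds = Bounds([-2.0] * (2 * n) + [0.0], [2.0] * (2 * n) + [1.0])

# Maximize the violation (minimize its negative), starting from two feasible points.
z0 = np.array([1.5, 0.0, -1.5, 0.0, 0.5])
res = minimize(lambda z: -violation(z), z0, method="SLSQP",
               bounds=bounds, constraints=cons)
print(-res.fun)  # strictly positive => the set is not convex
```

Here the midpoint of $(1.5, 0)$ and $(-1.5, 0)$ lies strictly inside the unit ball, so the solver reports a positive violation and convexity of the set is disproved.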

Verify convexity of $f$ on the domain $\{x \in X : g(x) \le 0\}$

Solve the following problem, and check the optimal value.

\begin{align} \max\qquad& \small f(\lambda x + (1-\lambda) y) - \left(\lambda f(x) + (1-\lambda) f(y)\right) && \small\textrm{(maximize violation of convexity)}\\ \text{s.t.}\qquad& g(x) \le 0 && \textrm{($x$ is feasible)}\\ & g(y) \le 0 && \textrm{($y$ is feasible)}\\ & 0 \le \lambda \le 1 && \textrm{(define convex combination)}\\ & x, y \in X && \textrm{(definitions of $x$ and $y$)}\\ \end{align}

If the optimal value is (strictly) positive, then there exists a convex combination such that $f(\lambda x + (1-\lambda) y) > \lambda f(x) + (1-\lambda) f(y)$, which violates convexity. If the global optimal value is non-positive, $f$ is convex on the domain.
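Again as an illustration (my own made-up example, not from the answer), the same recipe applied to the objective: take $f(x) = x^3$ on the box $X = [-1, 1]$ with no $g$ constraints, and maximize the convexity violation with a local solver. A strictly positive value disproves convexity; a non-positive value from a merely local solver proves nothing.

```python
import numpy as np
from scipy.optimize import minimize, Bounds

# Made-up example: f(x) = x^3 is not convex on X = [-1, 1].
f = lambda x: x ** 3

# z = (x, y, lambda); a positive value witnesses a violated convexity inequality.
def violation(z):
    x, y, lam = z
    return f(lam * x + (1 - lam) * y) - (lam * f(x) + (1 - lam) * f(y))

bounds = Bounds([-1.0, -1.0, 0.0], [1.0, 1.0, 1.0])
res = minimize(lambda z: -violation(z), x0=[-1.0, 0.5, 0.5], bounds=bounds)
print(-res.fun)  # strictly positive => f is not convex on X
```

For instance, $x = -1$, $y = 0.5$, $\lambda = 2/3$ gives $f(-0.5) = -0.125 > \tfrac{2}{3}f(-1) + \tfrac{1}{3}f(0.5) = -0.625$, a violation of $0.5$.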

Solving the above problems

As you already mention, solving the above problems could be very difficult, even if $f$ and $g$ are actually convex. Things can become even more difficult if the functions are 'weird'. For example, $$g(x) = \begin{cases}\textrm{0 if $x \in \mathbb{Q}^n$}\\\textrm{1 if $x \notin \mathbb{Q}^n$}\end{cases}$$ seems very problematic!

Kevin Dalmeijer
  • You should probably add some benevolent assumptions about the functions, for example continuity or (sub-)differentiability. If you do not make any assumptions, functions like the example you gave will complicate things extremely. – JakobS Sep 04 '19 at 08:26