
I would like to write a constraint as follows, where $x,y>0$ are optimization variables, and $a,b,c,d,A$ are positive constants. How can I make it a convex constraint?

\begin{equation} \frac{ax}{\ln(b + cy)} + dx \le A \end{equation}

2 Answers


The constraint is not convex, and is not transformable to a convex constraint without substantively changing it.

The additive linear term $dx$ is irrelevant to convexity, so let's ignore it and look at the simple case $a = b = c = 1$. The Hessian of $\frac{x}{\ln(1 + y)}$ at $x = y = 1$ has one positive and one negative eigenvalue, so $\frac{x}{\ln(1 + y)}$ is neither convex nor concave. For the constraint to be convex, that term would have to be convex, which it is not; therefore the constraint is not convex.
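A quick numerical check of this claim (a minimal sketch, assuming only $a = b = c = 1$; the second derivatives used here match the Hessian posted in the comments below):

```python
# With a = b = c = 1, build the Hessian of f(x, y) = x / ln(1 + y) at
# (x, y) = (1, 1) and confirm it has one positive and one negative
# eigenvalue, so f is neither convex nor concave.
import numpy as np

x, y = 1.0, 1.0
L = np.log(1.0 + y)                               # ln(b + c*y) with b = c = 1
f_xx = 0.0
f_xy = -1.0 / (L**2 * (1.0 + y))
f_yy = x * (L + 2.0) / (L**3 * (1.0 + y)**2)
H = np.array([[f_xx, f_xy], [f_xy, f_yy]])
print(np.linalg.eigvalsh(H))                      # approx [-0.44, 2.46]
```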

Mark L. Stone
  • Thank you very much for your kind reply. The Hessian of the function you mentioned is calculated as follows: \begin{equation} \begin{bmatrix} 0 & -\frac{ac}{\ln^2(b + cy)(b + cy)} \\ -\frac{ac}{\ln^2(b + cy)(b + cy)} & \frac{ac^2 x\left(\ln(b + cy) + 2\right)}{\ln^3(b + cy)(b + cy)^2} \end{bmatrix} \end{equation} – qinqinxiaoguai Jun 20 '21 at 17:49
  • From it, we know this function is concave. However, it is not a convex constraint. Am I right? – qinqinxiaoguai Jun 20 '21 at 17:49
  • A constraint is convex if the set of points satisfying the constraint is convex. For example, constraining (x, y, z) to be within a cube is a convex constraint because the set of points satisfying it is a solid cube, and a solid cube is convex. I think he means the function is neither concave nor convex; therefore, the function can't be the boundary of a convex set of points. Think about a parabola that opens upward, $f(x) = x^2$. The function is convex, the set of all points $(x, y)$ satisfying $y > x^{2}$ is convex, and $f(x)$ is the boundary of that set (see the short numerical check after these comments). – Qurious Cube Jun 20 '21 at 18:02
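As a minimal illustration of the set definition in the comment above (assuming nothing beyond the parabola example itself), the following sketch samples pairs of points satisfying $y > x^2$ and confirms their midpoints stay in the set; a sampling check can support, but not prove, convexity.

```python
# Sample pairs of points strictly above the parabola y = x**2 and check
# that each pair's midpoint is also above it, consistent with the set
# {(x, y) : y > x**2} being convex.
import numpy as np

rng = np.random.default_rng(0)
xs = rng.uniform(-10.0, 10.0, size=(1000, 2))         # two x-coordinates per pair
ys = xs**2 + rng.uniform(1e-3, 10.0, size=(1000, 2))  # strictly above the parabola
mid_x, mid_y = xs.mean(axis=1), ys.mean(axis=1)
print(np.all(mid_y > mid_x**2))                       # True
```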

Transform the optimizing variables $x$ and $y$ in everything (the whole model) to $u = \frac{ax}{\ln(b + cy)}$ and $v = x$.

The transformed constraint $u + dv \le A$ is linear.

Therefore, the transformed constraint in terms of $u$ and $v$ is convex.
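A minimal sketch of the transformed constraint in a DCP modeling tool (using cvxpy here as a stand-in for CVX, with hypothetical values $d = 0.5$ and $A = 10$ that are not from the question): since $u + dv \le A$ is linear, it is accepted as convex immediately. Note this only models the constraint in $(u, v)$; the comments below discuss why recovering $x$ and $y$ from $u$ and $v$ is problematic.

```python
# The transformed constraint u + d*v <= A is linear, hence convex, so a
# DCP tool accepts it directly. d and A below are hypothetical values.
import cvxpy as cp

d, A = 0.5, 10.0            # hypothetical positive constants
u = cp.Variable(pos=True)   # stands in for a*x / ln(b + c*y)
v = cp.Variable(pos=True)   # stands in for x

prob = cp.Problem(cp.Minimize(u + v), [u + d * v <= A])
prob.solve()
print(prob.status, u.value, v.value)
```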

Qurious Cube
  • Disclaimer: I have never done anything like this before. Don't be surprised if I make a mistake; I would be happy if anyone pointed out the mistakes, though. – Qurious Cube Jun 20 '21 at 19:20
  • Thank you very much! Your insights give me a further understanding of convexity and the Hessian. I have a further question about positive definiteness of the Hessian matrix: positive definiteness of the Hessian matrix is discussed over $p,q\in \mathbb{R}^2$; can we discuss the convexity of constraints over a restricted domain? Could you please show me any examples, websites, or papers which use a similar idea? Many thanks! According to your comments, if the domains of $a,b,c,d$ and $A$, as well as the domains of $x$ and $y$, satisfy the additional constraints you provided, is the constraint convex in the restricted domain? – qinqinxiaoguai Jun 20 '21 at 23:53
  • The idea of convexity of a function $f$ over a convex open set $S$ is from the "mathematical methods for economic theory" link. I think some subsets of $\mathbb{R}^{2}$ are convex sets (e.g., all points within a triangle or a square that is within $\mathbb{R}^{2}$). – Qurious Cube Jun 21 '21 at 00:28
  • Many transformations can change the convexity of a set of points/constraints. For example, a cube is convex, but if you deform it by punching it, the result is probably no longer convex. Deforming the space (as in relativity) has the same effect. Therefore, you can't talk about convexity without considering the optimizing variables (or the coordinate system). – Qurious Cube Jun 21 '21 at 00:34
  • The transformed constraint, $u > 0$, and $v > 0$ together form a convex constraint in terms of $u$ and $v$ if the following are true: 1. $u$ and $v$ are your optimizing variables; 2. you write everything in terms of $u$ and $v$; 3. you add the additional constraints $u > 0$ and $v > 0$. By "together" I mean something like this: consider the piecewise function $f(x) = x^2$ if $x > 0$ and $f(x) = -x^2$ if $x < 0$. The function $f(x)$ is neither concave nor convex, and $f(x) < 2$ is not a convex constraint. However, if you add a constraint $x > 0$, then $f(x) < 2$ and $x > 0$ together form a convex constraint. – Qurious Cube Jun 21 '21 at 00:44
  • The number of transformations of the optimizing variables that can make your constraint convex is infinite. The particular transformation that might luckily work for you depends on the rest of your optimization problem. I made a wild guess and picked one transformation. – Qurious Cube Jun 21 '21 at 00:48
  • Thanks a lot for your kind explanation. Your proposed transformation totally works for my problem. According to your suggestion, I tried to put it in a form that can be entered into CVX. For $g(u,v)=u+suv-v$, I tried to replace $u$ and $v$ with $e^p$ and $e^q$ so that the product $uv$ can be converted to the sum $p+q$. However, $e^{e^q}$ occurs because $y_1+y_2<B$ (a constant) is an original constraint, which violates DCP rules. – qinqinxiaoguai Jun 21 '21 at 02:16
  • You seem experienced with this kind of educated guess. Could you please tell me how to learn transformations of the optimizing variables that can take a constraint from non-convex to convex, so that I can try other transformations? Or do you have other ideas for how to make $g(u,v)$ acceptable to CVX? Many thanks! – qinqinxiaoguai Jun 21 '21 at 02:18
  • Well, I really have never done something like this. And sorry, I think I found a mistake in my derivation. The mistake is: $H$ is positive definite if and only if $h(p, q) > 0$ for all $(p, q) \ne (0, 0)$. A function over a domain (that is, a convex subset of $\mathbb{R}^{2}$) is convex if its Hessian within the domain is positive definite. – Qurious Cube Jun 21 '21 at 02:37
  • Sorry, I can't understand very well why you say it is incorrect. Actually, my problem is as follows:

    \begin{array}{l} \mathop {\min }\limits_{x,y} \sum\limits_{n \in \{1,2\}} \frac{a_n x_n}{\ln(1 + c_n y_n)} + b_n x_n \\ \text{s.t. } \frac{a_n x_n}{\ln(1 + c_n y_n)} + b_n x_n \le A_n, \quad \forall n \in \{1,2\} \\ x_1 + x_2 \le 1 \\ y_1 + y_2 \le B \\ x_n > 0, \; y_n > 0 \end{array}

    According to your comments, I think using $u=\frac{a}{A}x$ and $v = \ln(1+cy)$ can make $H$ positive definite over the domain $u>0$, $v>0$, and $u>0$, $v>0$ can be satisfied in my problem. What is the mistake?

    – qinqinxiaoguai Jun 21 '21 at 03:14
  • Note that $a$, $b$, $c$, $A$, and $B$ are positive. Looking forward to your reply. Thanks a lot! – qinqinxiaoguai Jun 21 '21 at 03:16
  • $h(p, q, u, v)$ needs to be positive for all non-zero $p$ and $q$ for the function $g$ to be convex over $u$ and $v$. It is OK for $h$ to be negative over some $u$ and $v$; that just means $g$ is not convex over those $u$ and $v$. In general, $h$ is a function of $p$, $q$, $u$, $v$. It just happens that the second derivative of $g$ doesn't involve $u$ and $v$. – Qurious Cube Jun 21 '21 at 03:17
  • $h(p,q,u,v)>0$ cannot be satisfied because $(p,q)$ should be a non-zero vector in $\mathbb{R}^2$. The definition of $p,q$ is unrelated to $u,v$. In any case, the constraint we discussed is non-convex. Am I right? – qinqinxiaoguai Jun 21 '21 at 03:32
  • yup. Since $H$ is not positive-definite for all $u$, $v$, the function $g(u, v)$ is not convex everywhere. – Qurious Cube Jun 21 '21 at 03:35
  • Sorry, did you mean "for all $p,q$" or "for all $u,v$"? Actually, the domain of $g(u,v)$ is $u>0,v>0$. – qinqinxiaoguai Jun 21 '21 at 03:44
  • For all $p, q$; see the first definition in https://en.wikipedia.org/wiki/Definite_matrix – Qurious Cube Jun 21 '21 at 03:49
  • Thank you very much! I have better understood it. – qinqinxiaoguai Jun 21 '21 at 03:59
  • Try $u_{n} = \frac{x_n}{\ln(1 + c_n y_n)}$ and $v_{n} = x_{n}$. All constraints seem to be either linear or convex if $y_n > 0$. I don't have time to check. – Qurious Cube Jun 21 '21 at 05:43
  • For the B constraint, first get rid of it and solve. If the answer satisfies the B constraint, you are done. If not, see https://or.stackexchange.com/questions/4701/global-optimization-when-the-exponential-function-is-involved But really, put your whole problem into a new question. You will get better help that way. – Qurious Cube Jun 21 '21 at 06:10
  • Thank you very much for your comment! Unfortunately, $e^{\frac{v_1}{u_1}}+e^{\frac{v_2}{u_2}}\le B$ is non-convex (see the symbolic check after these comments). Anyway, thank you for your contribution. I will try to post the whole problem in a new question. Best regards! – qinqinxiaoguai Jun 21 '21 at 06:45
    The proposed solution won't work for at least two reasons. First, it assumes that $x$ and $y$ can be recovered from $u$ and $v$. If the solver chooses $u=2$ and $v=0$, good luck solving for $y$. Second, it ignores the presence of $x$ and $y$ elsewhere in the model. So the solver will choose values for $u$, $v$, $x$ and $y$ and there is nothing requiring the chosen values of $u$ and $v$ (or at least $u$) to conform to the chosen values of $x$ and $y$. – prubin Jun 21 '21 at 21:06
  • hmm. I meant to transform the optimizing variables ($x$ & $y$) in everything (such as the $B$ constraint) to $u$ & $v$. Then, the original constraint $x > 0$ transforms to $v > 0$. That means $v \ne 0$. I edited the answer and added "in everything (the whole model)" to clarify this point. The problem is that the transformed "$B$" constraint is not convex. But that is not part of the question. I simplified the model and tried to prove that no transformation can work for the whole model in https://or.stackexchange.com/questions/6475/existence-of-a-transformation-to-convex-optimization – Qurious Cube Jun 22 '21 at 00:23
  • Comments are not for extended discussion; this conversation has been moved to chat. – ə̷̶̸͇̘̜́̍͗̂̄︣͟ Jun 22 '21 at 06:37
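The non-convexity of $e^{v/u}$ claimed in the comments above can be confirmed symbolically (a sketch using sympy; the subscripts are dropped since one term suffices):

```python
# The determinant of the Hessian of exp(v/u) is -exp(2*v/u)/u**4 < 0 for
# all u, v > 0, so the Hessian is indefinite everywhere on that domain and
# exp(v/u) is neither convex nor concave there.
import sympy as sp

u, v = sp.symbols('u v', positive=True)
g = sp.exp(v / u)
H = sp.hessian(g, (u, v))
print(sp.simplify(H.det()))   # -exp(2*v/u)/u**4
```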