2

In regularized regression, for example ridge regression, we have the Lagrangian (penalized) form, which adds lambda times the squared 2-norm of the parameters to the loss function and minimizes the sum. On the other hand, this is equivalent to minimizing the loss function subject to the constraint that the squared 2-norm of the parameters is at most K.
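In symbols (writing the penalty with the squared 2-norm, as is standard for ridge, and writing $L(\beta)$ for the loss), the two formulations are

$$\min_{\beta}\; L(\beta)+\lambda\lVert\beta\rVert_2^2 \qquad\text{and}\qquad \min_{\beta}\; L(\beta)\ \text{ subject to }\ \lVert\beta\rVert_2^2\le K.$$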

My question is: is there an explicit formula relating lambda and K?

kaixu

1 Answer

2

Let $x^{\ast}$ be the solution of minimizing $f(x)+\lambda R(x)$, where $R(x)$ is the regularization term and $\lambda > 0$.

Now consider the minimization problem where you want to minimize $f(x)$ subject to $R(x)\le k$ with $k > 0$. Set $k=k^{\ast}:=R(x^{\ast})$, and let $x^{\prime}$ be a solution of this particular problem. Since $x^{\ast}$ itself satisfies the constraint, this means that $f(x^{\prime})\le f(x^{\ast})$; moreover, $R(x^{\prime})\le k^{\ast}=R(x^{\ast})$, or equivalently $\lambda R(x^{\prime})\le \lambda R(x^{\ast})$.

But if either of the two inequalities $f(x^{\prime})\le f(x^{\ast})$ and $\lambda R(x^{\prime}) \le \lambda R(x^{\ast})$ were strict, adding them would give $f(x^{\prime})+\lambda R(x^{\prime}) < f(x^{\ast})+\lambda R(x^{\ast})$, contradicting the fact that $x^{\ast}$ minimizes $f(x)+\lambda R(x)$. Thus $f(x^{\prime}) = f(x^{\ast})$ and $\lambda R(x^{\prime}) = \lambda R(x^{\ast})$, so $x^{\ast}$ is also a solution of the second (constrained) minimization problem, and $x^{\prime}$ is also a solution of the first (penalized) one.

This shows that solving a minimization problem of the first kind also solves a particular minimization problem of the second kind, and vice versa.

The relationship is $k=R(x^{\ast})$.
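For the ridge case from the question (a sketch under the standard setup with design matrix $X$, response $y$, and $R(\beta)=\lVert\beta\rVert_2^2$), the penalized minimizer has the closed form $\beta^{\ast}(\lambda)=(X^{\top}X+\lambda I)^{-1}X^{\top}y$, so the mapping becomes explicit:

$$K(\lambda)=\bigl\lVert (X^{\top}X+\lambda I)^{-1}X^{\top}y \bigr\rVert_2^2.$$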

ghlavin
  • So assume we have a lambda. To find a formula for the corresponding K, we'd then need a closed-form solution for x*. – kaixu Nov 01 '19 at 00:01
  • In my opinion, yes. A closed-form solution exists for ridge regression (L2 regularization), but it is the only regularization I know of with this property (see the sketch after these comments). – ghlavin Nov 01 '19 at 00:12
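A minimal numerical sketch of that mapping (synthetic data and an arbitrarily chosen lambda, purely for illustration): compute the closed-form ridge solution for a given lambda; the corresponding constraint level K is then the squared 2-norm of that solution.

```python
import numpy as np

# Illustrative only: synthetic data and an arbitrary penalty strength.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                     # design matrix
beta_true = np.array([1.0, -2.0, 0.5, 0.0, 3.0])
y = X @ beta_true + rng.normal(scale=0.1, size=100)

lam = 2.0                                         # chosen lambda

# Closed-form ridge solution: beta* = (X'X + lam*I)^{-1} X'y
p = X.shape[1]
beta_star = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Constraint level that makes the constrained problem equivalent:
K = np.sum(beta_star ** 2)                        # K = ||beta*(lambda)||_2^2
print(f"lambda = {lam}  ->  K = {K:.4f}")
```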