Let $y \in \Bbb R^n$, $\Bbb 1$ be an $n$-vector with all its entries equal to $1$, and $Z \in \Bbb R^{n \times p}$ with columns of unit norm and such that $Z^T \Bbb 1 = 0$. The elastic net is a penalized regression procedure for fitting the model $y \sim \mathcal N (\beta_0 \Bbb 1 + Z \xi, \sigma^2 I)$, combining both ridge regression and the lasso. This is done by simultaneously including $L_2$ and $L_1$ terms in the penalty. Its minimization objective is given by $$\min_{\beta_0 \in \Bbb R, \xi \in \Bbb R^p} \big\{ \lVert y-\beta_0 \Bbb 1 - Z \xi \rVert_2^2 + \lambda_2 \lVert \xi\rVert_2^2 + \lambda_1\lVert \xi\rVert_1 \big\} $$ Show that this objective function can be reformulated into a lasso-type one (i.e. into a new penalized regression problem with new variables and only an $L_1$ penalty).

I thought of using the Pythagorean theorem to combine the two $L_2$ norms, but it does not lead anywhere since the vectors are not orthogonal. I also thought of dividing the whole expression by $\lVert \xi\rVert_2^2$, so that the problem would stay the same, but then the radius associated with the $L_1$ norm would depend on $\xi$, and I don't think we want that. Can we even get a nice closed form for this problem?
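For what it's worth, here is a small numerical sanity check of the standard augmented-data idea one might try (stacking $\sqrt{\lambda_2}\, I$ under $Z$ and zeros under $y$, so the ridge term is absorbed into a single residual norm). This is only a sketch: it ignores the intercept $\beta_0$, which can be profiled out separately since $Z^T \Bbb 1 = 0$, and it checks the identity at one arbitrary $\xi$ rather than proving it.

```python
import numpy as np

# Sanity check of the augmented-data reformulation:
#   ||y - Z xi||^2 + lam2 ||xi||^2 + lam1 ||xi||_1
#     == ||y_aug - Z_aug xi||^2 + lam1 ||xi||_1
# with y_aug = [y; 0] and Z_aug = [Z; sqrt(lam2) * I],
# i.e. a lasso-type objective in the augmented variables.

rng = np.random.default_rng(0)
n, p = 20, 5
Z = rng.standard_normal((n, p))
Z -= Z.mean(axis=0)                 # enforce Z^T 1 = 0 (centered columns)
Z /= np.linalg.norm(Z, axis=0)      # unit-norm columns
y = rng.standard_normal(n)
xi = rng.standard_normal(p)         # arbitrary test point
lam1, lam2 = 0.3, 0.7

elastic = (np.sum((y - Z @ xi) ** 2)
           + lam2 * np.sum(xi ** 2)
           + lam1 * np.sum(np.abs(xi)))

y_aug = np.concatenate([y, np.zeros(p)])
Z_aug = np.vstack([Z, np.sqrt(lam2) * np.eye(p)])
lasso = np.sum((y_aug - Z_aug @ xi) ** 2) + lam1 * np.sum(np.abs(xi))

print(np.isclose(elastic, lasso))   # True: the two objectives agree
```

Since the objectives agree pointwise in $\xi$, minimizing one is the same as minimizing the other.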

Kilkik
