
I am interested in the constrained L1 Lasso problem: $$\min_{\beta\in \mathbb{R}^p:\ \sum_{i\in[p]}|\beta_i|=1} \|X\beta -s\|^2, \tag{1}$$ for a design matrix $X\in \mathbb{R}^{n \times p}$ and a target $s\in \mathbb{R}^n$. In particular, I am interested in the setting where there are more features than examples, i.e. $p>n$ (and potentially $p\gg n$). Is there an algorithm that solves $(1)$ up to error, say $\epsilon$, with a computational complexity of order $O(p n^2 \log \epsilon^{-1})$, or something similar?

Specific requirements:

  • I want an algorithm whose computational complexity scales linearly (up to log factors) with the number of features $p$. In particular, I do not want a complexity that scales as $O(p^k)$ for $k\geq 2$.
  • I specifically want a linearly-convergent algorithm, i.e. one that requires $O(\log(1/\epsilon))$ iterations rather than $O(\mathrm{poly}(1/\epsilon))$.

Note that the answer given in *Computational complexity of the lasso (lars vs coordinate descent)* does not give me what I want; it mentions an algorithm whose computational complexity scales as $p^3 + n p^2$ (adapting their notation to my setting).
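
For reference, here is the kind of baseline I have in mind: plain projected gradient descent, relaxing the equality constraint in $(1)$ to $\|\beta\|_1 \le 1$ and using the Euclidean projection onto the $\ell_1$ ball (Duchi et al., 2008). This is only a minimal NumPy sketch (the function names are my own), and it illustrates the gap: each iteration is cheap, $O(np + p\log p)$, but without strong convexity, which fails for $p>n$, plain projected gradient only guarantees $O(1/\epsilon)$ iterations rather than the $O(\log(1/\epsilon))$ I am asking for.

    import numpy as np

    def project_l1_ball(v, z=1.0):
        """Euclidean projection of v onto {x : ||x||_1 <= z} (Duchi et al., 2008).
        Costs O(p log p) due to the sort."""
        if np.sum(np.abs(v)) <= z:
            return v.copy()
        u = np.sort(np.abs(v))[::-1]                    # magnitudes, descending
        css = np.cumsum(u)
        rho = np.nonzero(u * np.arange(1, v.size + 1) > css - z)[0][-1]
        theta = (css[rho] - z) / (rho + 1.0)            # soft-threshold level
        return np.sign(v) * np.maximum(np.abs(v) - theta, 0.0)

    def projected_gradient(X, s, n_iter=1000):
        """Projected gradient for min ||X beta - s||^2 s.t. ||beta||_1 <= 1.
        Per iteration: O(np) for the gradient + O(p log p) for the projection.
        Iteration count is O(1/eps) in general; O(log(1/eps)) would need strong
        convexity of the objective, which fails when p > n."""
        n, p = X.shape
        step = 1.0 / (2.0 * np.linalg.norm(X, 2) ** 2)  # 1/L, L = 2*sigma_max(X)^2
        beta = np.zeros(p)
        for _ in range(n_iter):
            grad = 2.0 * X.T @ (X @ beta - s)
            beta = project_l1_ball(beta - step * grad)
        return beta

An accelerated (FISTA-style) variant would improve this to $O(1/\sqrt{\epsilon})$ iterations, but that is still not the $O(\log(1/\epsilon))$ rate I am after.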
