
Is there a faster way to compute the nuclear norm (also called the trace norm: the sum of the singular values) of a matrix $A$ than computing the SVD of $A$ directly (or diagonalizing $A^*A$)?

I am particularly interested in the case where $A$ is square; assuming that $A$ is real would be fine too. I am thinking of the limit of large matrices.

Brent
  • This was posted to Mathematics SE here – Brent Sep 27 '16 at 14:33
  • Are you really just interested in computing this norm, or are you interested in finding a matrix that minimizes the norm, perhaps subject to some constraints? Do you have reason to believe that your matrix is of low rank? – Brian Borchers Sep 27 '16 at 21:05
  • For my problem I already know the minimum, or I'm not interested in it anyway. I expect it to be nearly full rank. Thanks – Brent Sep 28 '16 at 20:42
  • The intuition here is that computing this one number from a matrix should be easier than iteratively computing the whole singular value spectrum to great accuracy. However, I haven't come up with anything. – Brent Oct 04 '16 at 13:23

1 Answer


Depending on your scenario, perhaps you could make use of a result from the paper by Recht, Fazel, and Parrilo, "Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization" (2007).

If the matrix $X \in \mathbb{R}^{m \times n}$ has rank at most $r$, then

$$ \left \Vert X \right\Vert_* = \inf_{L \in \mathbb{R}^{m \times r},\, R \in \mathbb{R}^{n \times r}} \left\{ \frac{1}{2}\left \Vert L \right\Vert_F^2 + \frac{1}{2}\left \Vert R \right\Vert_F^2 \;:\; X = LR^{T} \right\} $$

Feng, Xu, and Yan used this characterization in their online robust PCA algorithm to avoid computing a singular value decomposition altogether: instead of taking an SVD, one optimizes over the factors $L$ and $R$ directly.
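For intuition, here is a minimal NumPy sketch (my own illustration, not code from either paper) that checks the identity numerically: any feasible factorization $X = LR^T$ gives an upper bound on $\left \Vert X \right\Vert_*$, and the choice $L = U\Sigma^{1/2}$, $R = V\Sigma^{1/2}$ from a thin SVD $X = U\Sigma V^T$ attains it.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, r = 50, 40, 5

# Build a rank-r test matrix X = L0 @ R0.T
L0 = rng.standard_normal((m, r))
R0 = rng.standard_normal((n, r))
X = L0 @ R0.T

# Reference value: nuclear norm via a full SVD
nuc = np.linalg.norm(X, ord="nuc")

# Any feasible factorization X = L @ R.T gives an upper bound:
# (1/2) * (||L||_F^2 + ||R||_F^2) >= ||X||_*
upper = 0.5 * (np.linalg.norm(L0, "fro") ** 2 + np.linalg.norm(R0, "fro") ** 2)

# The infimum is attained at L = U * sqrt(s), R = V * sqrt(s),
# where X = U @ diag(s) @ V.T is a thin SVD
U, s, Vt = np.linalg.svd(X, full_matrices=False)
L_opt = U[:, :r] * np.sqrt(s[:r])   # scale columns by sqrt of singular values
R_opt = Vt[:r].T * np.sqrt(s[:r])
opt = 0.5 * (np.linalg.norm(L_opt, "fro") ** 2 + np.linalg.norm(R_opt, "fro") ** 2)

print(f"nuclear norm        : {nuc:.6f}")
print(f"generic L0, R0 bound: {upper:.6f}")  # >= nuclear norm
print(f"optimal L, R value  : {opt:.6f}")    # == nuclear norm (up to rounding)
assert upper >= nuc - 1e-8
assert np.isclose(opt, nuc)
```

Of course, this sketch still calls the SVD, so it only verifies the formula; the computational payoff comes when $L$ and $R$ are found by direct optimization, as in the online robust PCA paper, rather than by a decomposition.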

dr.blochwave