Given two matrices $A$ and $B$, I'd like to find vectors $x$ and $y$ that minimize $$ \sum_{ij} (A_{ij} - x_i y_j B_{ij})^2. $$ In matrix form, I'm trying to minimize the Frobenius norm of $A - \operatorname{diag}(x) \cdot B \cdot \operatorname{diag}(y) = A - B \circ (x y^\top)$, where $\circ$ denotes the entrywise (Hadamard) product.
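For the rank-one case, one heuristic I tried is alternating least squares: with $y$ fixed, each $x_i$ has a closed-form solution $x_i = \sum_j A_{ij} B_{ij} y_j / \sum_j (B_{ij} y_j)^2$, and symmetrically for $y$. A minimal numpy sketch (the function name `als_rank1` and the update scheme are mine, not from any reference; note the solution is only determined up to reciprocal scaling of $x$ and $y$, and the updates assume the denominators are nonzero):

```python
import numpy as np

def als_rank1(A, B, iters=500, seed=0):
    # Alternating least squares for min over x, y of
    # sum_ij (A_ij - x_i * y_j * B_ij)^2.  (A heuristic sketch;
    # it may converge to a local minimum in general.)
    rng = np.random.default_rng(seed)
    m, n = A.shape
    y = rng.standard_normal(n)
    for _ in range(iters):
        # With y fixed, x_i = sum_j A_ij B_ij y_j / sum_j (B_ij y_j)^2.
        By = B * y                                   # B_ij * y_j
        x = (A * By).sum(axis=1) / (By ** 2).sum(axis=1)
        # With x fixed, y_j = sum_i A_ij B_ij x_i / sum_i (B_ij x_i)^2.
        Bx = B * x[:, None]                          # B_ij * x_i
        y = (A * Bx).sum(axis=0) / (Bx ** 2).sum(axis=0)
    return x, y
```

On exact rank-one data, i.e. $A = B \circ (x^* {y^*}^\top)$, the residual drives to essentially zero; when $B$ is the all-ones matrix, one full sweep already recovers the product $x y^\top$ exactly.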
More generally, I'd like to find $n$ pairs of unit vectors $x^{(k)}$, $y^{(k)}$ and positive real coefficients $s_k$ that minimize $$ \sum_{ij} \Big(A_{ij} - \sum_{k=1}^n s_k \, x_i^{(k)} y_j^{(k)} B_{ij}\Big)^2. $$
This is equivalent to the singular value decomposition (SVD) when $B_{ij} = 1$ for all $i, j$: by the Eckart–Young theorem, the best rank-$n$ approximation of $A$ in the Frobenius norm is given by its truncated SVD.
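To make the all-ones case concrete, here is a quick numpy check of the rank-one version (using `np.linalg.svd`; Eckart–Young says the top singular triple gives the best rank-one fit, and the squared residual equals the energy in the remaining singular values):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((5, 4))

# When B is the all-ones matrix, the objective is ||A - x y^T||_F^2.
U, s, Vt = np.linalg.svd(A)
x = np.sqrt(s[0]) * U[:, 0]   # split the top singular value between x and y
y = np.sqrt(s[0]) * Vt[0]

# Eckart-Young: the best rank-1 residual is the sum of the remaining
# squared singular values.
residual = np.sum((A - np.outer(x, y)) ** 2)
print(np.isclose(residual, np.sum(s[1:] ** 2)))  # True
```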
Does anybody know what this problem is called? Is there a well-known algorithm, like the SVD, for solving such a problem?
(migrated from math.SE)