
Suppose we have a set of matrices over the complex field of the form $a_iv_iv_i^H$ for $i\in\{1,\dots,n\}$, where the $a_i$ are constant positive real scalars and the $v_i$ are constant complex-valued finite-dimensional matrices, all of the same dimensions. This makes each $v_iv_i^H$ positive semidefinite (PSD). $I$ is the identity matrix.

Now we want to maximize the following determinant over the choice of index $i$: $$\mathrm{maximize}_{i\in\{1,\dots,n\}}\:\det \left( I+\frac{a_iv_iv_i^H}{I+\sum_{j\neq i} a_jv_jv_j^H} \right).$$

Essentially we pick one matrix for the numerator and all the rest go into the denominator. Which matrix should go in the numerator?
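
For concreteness, here is a small brute-force sketch (not part of the original post; the dimensions, scalars, and matrices are random placeholders) that evaluates the objective for each candidate index $i$ and picks the best one:

```python
# Brute-force sketch with random placeholder data: evaluate
# det( I + a_i v_i v_i^H (I + sum_{j != i} a_j v_j v_j^H)^{-1} ) for each i.
import numpy as np

rng = np.random.default_rng(0)
d, k, n = 4, 2, 5                          # v_i is d x k, so v_i v_i^H is d x d
a = rng.uniform(0.5, 2.0, size=n)          # positive real scalars a_i
V = rng.standard_normal((n, d, k)) + 1j * rng.standard_normal((n, d, k))

I = np.eye(d)
terms = [a[i] * V[i] @ V[i].conj().T for i in range(n)]   # a_i v_i v_i^H, PSD

def objective(i):
    denom = I + sum(terms[j] for j in range(n) if j != i)
    return np.linalg.det(I + terms[i] @ np.linalg.inv(denom)).real

best = max(range(n), key=objective)
print("best index:", best, "objective value:", objective(best))
```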

P.S.: I believe this question is related to Determinant of sum of positive definite matrices.

  • I think that $\det(a_iv_iv_i^H)=0$ for each $i$, so the sorting would not work. The maximum depends on the directions of the vectors. – Alex Degtyarev Jan 21 '14 at 18:03
  • @AlexDegtyarev If $a_iv_iv_i^H$ is PD then all eigenvalues are positive, so the determinant is nonzero. But I will change the claim to reflect eigenvalues. Thanks a lot – MLT Jan 21 '14 at 18:40
  • $v_i v_i^H$ is of rank 1, i.e. all its eigenvalues except one are 0. – Dima Pasechnik Jan 21 '14 at 19:37
  • The product you mention equals $1+a_i|v_i|^2$. Still the claim is almost obviously wrong: if, say, all $a_i|v_i|^2=1$, the maximum depends dramatically on the directions of the vectors. – Alex Degtyarev Jan 21 '14 at 20:19
  • What I mean is that two collinear vectors may overpower an orthogonal vector of larger norm. I don't think there is an easy answer to your question. At least, not without a very bright idea :) – Alex Degtyarev Jan 21 '14 at 20:21
  • @DimaPasechnik $v_i$ are matrices not vectors. Therefore $v_iv_i^H$ may not be rank one, right? – MLT Jan 21 '14 at 20:34
  • @AlexDegtyarev Could you please explain which matrix norm you mean by $\|v_i\|$? They are not vectors. – MLT Jan 21 '14 at 20:39
  • Oops! Sorry, you've changed that part, too! Then it is even more complicated. You see, you are adding a bunch of non-commuting operators (as in general the eigenspaces are different), and this seems like a completely transcendental problem (in the bad sense of the word). Even if the operators commute, it is not immediately obvious which one is to be thrown out: just try $I$ plus several diagonal matrices. (I think, even in dimension $2$ that would depend on the matrices a lot: try it by hand.) – Alex Degtyarev Jan 21 '14 at 20:53
  • it is not advised to use lower case letters to denote matrices, especially if you use $I$ to denote the identity matrix, unless you want to confuse readers... – Dima Pasechnik Jan 22 '14 at 20:13

1 Answer


Not a full answer for now, but a remark that simplifies the problem slightly.

Write the objective function as $$ \det \left( \frac{I+\sum_j a_jv_jv_j^H}{I+\sum_{j\neq i} a_jv_jv_j^H} \right) = \frac{\det\left(I+\sum_j a_jv_jv_j^H\right)}{\det\left(I+\sum_{j\neq i} a_jv_jv_j^H\right)}. $$ The numerator does not depend on $i$ and can be ignored, so you are really minimizing

$$\det\left(I+\sum_{j\neq i} a_jv_jv_j^H\right).$$
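
To illustrate, here is a quick numerical check on random placeholder data (my own sketch, not part of the answer): it verifies the ratio identity above and that the index maximizing the original objective is exactly the one minimizing the reduced determinant.

```python
# Numerical sanity check with random placeholder data (my own sketch):
# det(I + A(I+B)^{-1}) = det(I + A + B) / det(I + B), so with the constant
# numerator dropped, maximizing over i is the same as minimizing
# det(I + sum_{j != i} a_j v_j v_j^H).
import numpy as np

rng = np.random.default_rng(1)
d, k, n = 4, 2, 5
a = rng.uniform(0.5, 2.0, size=n)
V = rng.standard_normal((n, d, k)) + 1j * rng.standard_normal((n, d, k))

I = np.eye(d)
terms = [a[i] * V[i] @ V[i].conj().T for i in range(n)]
total_det = np.linalg.det(I + sum(terms)).real            # constant numerator

def objective(i):
    denom = I + sum(terms[j] for j in range(n) if j != i)
    return np.linalg.det(I + terms[i] @ np.linalg.inv(denom)).real

def reduced(i):
    return np.linalg.det(I + sum(terms[j] for j in range(n) if j != i)).real

for i in range(n):
    assert np.isclose(objective(i), total_det / reduced(i))

print(max(range(n), key=objective) == min(range(n), key=reduced))   # True
```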