I have a problem that I believe some machine learning technique should be able to solve, but I am new to machine learning and have no idea where to start.
So, I have multiple multivariate parameter vectors $\mathbf x$ and corresponding output vectors $\mathbf b$.
$\mathbf b$ was obtained by applying a matrix $\mathbf A$ to $\mathbf x$, i.e. $\mathbf{Ax = b}$.
The data $\mathbf x$ and $\mathbf b$ that I have are noisy; nevertheless, I would like to estimate the matrix $\mathbf A$ from them.
So the problem should be solving
$$\operatorname*{arg\,min}_{\mathbf A} \| \mathbf{AX - B} \|$$
where $\mathbf X$ is a matrix containing the parameter vectors as its columns, $\mathbf B$ stores the corresponding output vectors as its columns, and $\|\cdot\|$ is some matrix norm (I would guess the Frobenius norm).
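To make the setup concrete, here is a small NumPy sketch of how I picture the data and the objective. The dimensions, noise level, and variable names are purely illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: m-dimensional outputs, n-dimensional parameters, k samples
m, n, k = 4, 3, 100

A_true = rng.normal(size=(m, n))                # the unknown matrix I want to recover
X = rng.normal(size=(n, k))                     # parameter vectors stored as columns
B = A_true @ X + 0.1 * rng.normal(size=(m, k))  # noisy output vectors as columns

def objective(A, X, B):
    """Frobenius norm of the residual A X - B."""
    return np.linalg.norm(A @ X - B, ord="fro")
```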
Can anyone guide me on how to estimate the matrix $\mathbf A$?
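What I have tried so far: since $\mathbf{AX = B}$ can be transposed into $\mathbf{X^\top A^\top = B^\top}$, ordinary least squares on the transposed system seems related, but I am not sure whether this is the right approach:

```python
# Transpose the system: X^T A^T ≈ B^T, then solve for A^T by least squares
A_hat_T, residuals, rank, sv = np.linalg.lstsq(X.T, B.T, rcond=None)
A_hat = A_hat_T.T

print(objective(A_hat, X, B))  # residual norm of the fitted A
```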