
From https://stats.stackexchange.com/a/236297/22199, I quote:

> A mixture distribution combines different component distributions with weights that typically sum to one (or can be renormalized). A Gaussian mixture is the special case where the components are Gaussians.

Does this mean the following is a regression equation for a Gaussian mixture model?

$$\hat{y} = p(\alpha_1 + \boldsymbol{\beta}_1 \cdot \mathbf{x}_1) + (1- p)(\alpha_0 + \boldsymbol{\beta}_0 \cdot \mathbf{x}_0)$$

Here, $p = p(\mathbf{x}_M)$ is a function of covariates $\mathbf{x}_M$ that gives the probability of belonging to the first Gaussian component rather than the second. That is, $$\begin{split}p(\mathbf{x}_M) &= \operatorname{expit}(\alpha_M + \boldsymbol{\beta}_M \cdot \mathbf{x}_M)\\ &= \frac{1}{1+ e^{-(\alpha_M + \boldsymbol{\beta}_M \cdot \mathbf{x}_M)}}, \end{split}$$

where the $\alpha$'s are the constant terms, the $\boldsymbol{\beta}$'s are vectors of coefficients, and the $\mathbf{x}$'s are vectors of covariates.

Estimation of the constants and coefficients is done in the usual way, by minimising the squared error $(\hat{y} - y)^2$.
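For concreteness, the prediction equation above can be sketched in NumPy. This is only an illustration of the proposed regression form, not a fitted GMM; the parameter names (`alpha1`, `beta1`, etc.) are hypothetical and would normally be estimated, e.g. by EM or by gradient descent on the squared error.

```python
import numpy as np

def expit(z):
    # Logistic function: 1 / (1 + exp(-z))
    return 1.0 / (1.0 + np.exp(-z))

def mixture_prediction(x1, x0, xM, params):
    """Compute y_hat = p*(a1 + b1.x1) + (1-p)*(a0 + b0.x0),
    with p = expit(aM + bM.xM) as in the question.
    `params` holds hypothetical parameter values."""
    a1, b1 = params["alpha1"], params["beta1"]
    a0, b0 = params["alpha0"], params["beta0"]
    aM, bM = params["alphaM"], params["betaM"]
    p = expit(aM + xM @ bM)                    # mixing probability
    return p * (a1 + x1 @ b1) + (1 - p) * (a0 + x0 @ b0)
```

With all covariates and coefficients zero and $\alpha_M = 0$, the gate gives $p = 0.5$ and the prediction is simply the average of the two intercepts.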

Alex
  • In CDF domain, a robust fit and switching can make an analytic fit for the initial parameters of the GMM. I thought the textbook approach was to run k-means with the same number of components as the GMM, and use the best cv fit as the initial conditions, then iterate with EM. – EngrStudent Sep 29 '16 at 02:52
  • I understand that expectation maximisation is used; however, I want to know whether the structural, internal workings of fitting a GMM can be explained by the regression equation I wrote. – Alex Sep 29 '16 at 02:55

0 Answers