I understand that the maximum likelihood estimate for a signal with Gaussian noise corresponds to minimizing the least-squares distance, for both linear models (OLS) and nonlinear models (nonlinear least squares regression).
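The equivalence stated above can be checked numerically. In this sketch (a hypothetical one-parameter model, not from the original post), minimizing the sum of squared residuals and minimizing the Gaussian negative log-likelihood return the same estimate, because the two objectives differ only by a monotone affine transformation:

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Hypothetical model y = b*x + iid Gaussian noise (illustrative only).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
y = 2.0 * x + rng.normal(scale=0.1, size=x.size)

def sse(b):
    # Least-squares objective: sum of squared residuals.
    return np.sum((y - b * x) ** 2)

def gaussian_nll(b, sigma=0.1):
    # Negative log-likelihood under iid N(0, sigma^2) noise.
    # Up to constants it equals sse(b) / (2 * sigma**2), so it
    # has the same minimizer as the least-squares objective.
    r = y - b * x
    return 0.5 * np.sum(r ** 2) / sigma ** 2 + x.size * np.log(sigma)

b_ls = minimize_scalar(sse).x
b_ml = minimize_scalar(gaussian_nll).x
print(b_ls, b_ml)  # the two estimates agree to numerical precision
```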

Is the converse also true? Does defining the sum of squared distances between two objects as the metric amount to assuming that the errors follow a Gaussian distribution?

An alternative would be, for example, that the Gaussian is not the only exponential-family stochastic model whose likelihood has a "sum of squared residuals" kernel.
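One way to see that the squared-residual kernel is tied to the Gaussian specifically: swap the noise distribution and the MLE objective changes. A minimal sketch (hypothetical data, not from the original post): under iid Laplace noise the negative log-likelihood for a location parameter is a sum of *absolute* residuals, whose minimizer is the sample median rather than the least-squares mean:

```python
import numpy as np

# Sketch: under iid Laplace noise, the MLE of a location parameter
# minimizes sum |y - mu| (least absolute deviations), not
# sum (y - mu)^2 -- so the squared-residual kernel is not universal.
rng = np.random.default_rng(1)
sample = 5.0 + rng.laplace(scale=1.0, size=2001)

# Grid search over the Laplace negative log-likelihood (up to constants).
mu_grid = np.linspace(4.0, 6.0, 1001)
nll = np.abs(sample[:, None] - mu_grid[None, :]).sum(axis=0)
mu_mle = mu_grid[np.argmin(nll)]

# The Laplace MLE tracks the median; the least-squares estimate is the mean.
print(mu_mle, np.median(sample), np.mean(sample))
```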

hirschme
  • Something's missing in your question. OLS can be used for any error distribution with some constraints, like spherical errors. However, it seems that you require that least squares be optimal in some sense. – Aksakal Mar 28 '18 at 20:55
  • @Aksakal Maybe the term "spherical errors" is the generalization I was looking for, but I cannot find any formal definition. A Google search points solely to "non-spherical errors" articles, and most relate to either heteroskedasticity or non-independence. Could you recommend a resource that helps me differentiate "spherical errors" from "iid Gaussian errors"? – hirschme Mar 28 '18 at 21:13
  • @Aksakal "The term 'spherical errors' will describe the multivariate normal distribution", so it is just another name, I guess. This leads me to think that least squares is in fact tightly linked to the assumption of Gaussianity – hirschme Mar 28 '18 at 21:58
  • Spherical errors are not the same as multivariate normal. A multivariate normal can be spherical but doesn't have to be. – Aksakal Mar 28 '18 at 22:15
  • Maybe a duplicate? https://stats.stackexchange.com/questions/173621/linear-regression-any-non-normal-distribution-giving-identity-of-ols-and-mle – kjetil b halvorsen Aug 04 '23 at 02:04
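The Gauss–Markov point raised in the comments can be sketched numerically: OLS is unbiased (and BLUE) under any iid, mean-zero, homoskedastic ("spherical") error distribution, with no normality required. A minimal simulation with hypothetical data and uniform errors:

```python
import numpy as np

# Sketch of the comments' point: OLS stays unbiased under spherical
# (iid, mean-zero, equal-variance) errors that are not Gaussian.
rng = np.random.default_rng(2)
x = np.linspace(-1.0, 1.0, 50)
X = np.column_stack([np.ones_like(x), x])
true_beta = np.array([1.0, 3.0])

estimates = []
for _ in range(2000):
    # Uniform errors: spherical but decidedly non-Gaussian.
    eps = rng.uniform(-0.5, 0.5, size=x.size)
    y = X @ true_beta + eps
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    estimates.append(beta_hat)

mean_estimate = np.mean(estimates, axis=0)
print(mean_estimate)  # close to the true coefficients [1.0, 3.0]
```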

0 Answers