2

I understand why Gaussian Processes are considered "non-parametric", but why do most authors use non-parametric models for Bayesian Optimization?

What's the benefit of using such models as opposed to parametric approaches (GLMs, etc.)?

Josh
  • 4,448

2 Answers

4

Parametric models assume the samples come from a specific distribution, e.g. a mixture of Gaussians where the number of Gaussian components is known a priori. This is restrictive, since for most real-world problems we cannot know beforehand how complex the data is. A nonparametric method, by contrast, should find the number of Gaussian components itself. Note that in this example the nonparametric method still assumes something, namely that the data come from a mixture of Gaussians; it just does not assume the number of components. Gaussian processes are nonparametric too: they use every single training point to build a basis. So GP methods are flexible and powerful, and they can learn complicated distributions (or decision boundaries, in the case of Gaussian process classification).
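
A minimal sketch of the mixture example, assuming scikit-learn's BayesianGaussianMixture (the library choice and the synthetic data are illustrative, not from the answer):

```python
# A truncated Dirichlet-process mixture: we offer up to 10 components,
# and the prior shrinks the weights of unneeded ones toward zero,
# so the data effectively choose the number of components.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from 3 Gaussians; the model is not told this.
X = np.concatenate([
    rng.normal(-5.0, 1.0, size=(200, 1)),
    rng.normal(0.0, 0.5, size=(200, 1)),
    rng.normal(4.0, 1.5, size=(200, 1)),
])

dpgmm = BayesianGaussianMixture(
    n_components=10,  # an upper bound, not a fixed a-priori choice
    weight_concentration_prior_type="dirichlet_process",
    random_state=0,
).fit(X)

# Most of the mixture weight should concentrate on ~3 components.
print(np.round(dpgmm.weights_, 3))
```

Compare this with a plain GaussianMixture, where n_components must be fixed in advance: that is exactly the parametric assumption the answer calls restrictive.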

Leila
  • 965
0

There's no evidence that the objective function should be linear, so nonparametric models like Gaussian processes, random forests, or neural networks are used as surrogates to approximate it as a nonlinear function.
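
A minimal sketch of a GP surrogate loop, assuming scikit-learn's GaussianProcessRegressor; the toy objective and the upper-confidence-bound acquisition rule are illustrative choices, not from the answer:

```python
# Bayesian-optimization loop with a GP surrogate: fit the GP to the
# points evaluated so far, then pick the next point by an
# upper-confidence-bound (UCB) acquisition over a grid.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):
    # Nonlinear "black-box" objective; the surrogate never sees its form.
    return np.sin(3.0 * x) + 0.5 * x

X_grid = np.linspace(0.0, 5.0, 500).reshape(-1, 1)
X = np.array([[0.5], [2.5], [4.0]])          # initial design points
y = f(X).ravel()

# Small alpha adds jitter for numerical stability if points repeat.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-6)
for _ in range(10):
    gp.fit(X, y)
    mu, sigma = gp.predict(X_grid, return_std=True)
    # UCB: trade off a high predicted mean against high uncertainty.
    x_next = X_grid[np.argmax(mu + 2.0 * sigma)]
    X = np.vstack([X, x_next.reshape(1, 1)])
    y = np.append(y, f(x_next)[0])

print("best x:", X[np.argmax(y), 0], "best f(x):", y.max())
```

Nothing in this loop assumes linearity: the GP posterior mean is a combination of kernel functions centered on the evaluated points, which is the sense in which it generalizes fixed basis expansions like polynomials or splines (see the comment below).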

Sengiley
  • 366
  • you may say, let's use parametric models like n-degree polynomials or splines, but Gaussian Processes can be considered a generalization of them – Sengiley Jan 06 '17 at 20:24
  • Yes, but couldn't one argue that there is no evidence, either, that an objective function would fit any other model? I.e., all models are equally likely to be a good fit in the absence of data, unless you know something about the objective function? – Josh Jan 09 '17 at 19:28