Questions tagged [bayesian-optimization]

Bayesian optimization is a family of global optimization methods that use information about previously computed values of the objective function to infer which points are plausibly optimal.

Its applications include computer experiments and hyperparameter optimization of machine learning models.

Some variants of Bayesian optimization are especially appealing because they strive to keep the total number of function evaluations to a minimum. This is desirable when function evaluations are very expensive, either because they demand substantial computing power or because they require physical experiments.

Some methods require no knowledge of the function at all (derivatives, functional form, etc.). This is useful when the function has no known closed form (such as the response surface of a model with respect to its hyperparameters) or is so complex that computing derivatives is impractical.
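The typical loop described above — fit a surrogate to past evaluations, pick the next point by maximizing an acquisition function, evaluate, repeat — can be sketched in a few lines. This is an illustrative toy implementation (a Gaussian-process surrogate with an RBF kernel, expected improvement for minimization, and a simple grid search over candidates); all function names and parameter values here are made up for the example, not taken from any particular library.

```python
import numpy as np
from scipy.stats import norm

def rbf_kernel(A, B, length_scale=0.2):
    # Squared-exponential kernel on 1-D inputs
    d2 = (A[:, None] - B[None, :]) ** 2
    return np.exp(-0.5 * d2 / length_scale**2)

def gp_posterior(X, y, Xs, noise=1e-6):
    # GP posterior mean and standard deviation at candidate points Xs
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    Ks = rbf_kernel(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mu = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = 1.0 - np.sum(v**2, axis=0)  # kernel has unit prior variance
    return mu, np.sqrt(np.maximum(var, 1e-12))

def expected_improvement(mu, sigma, best):
    # Expected improvement over the best observation (minimization)
    z = (best - mu) / sigma
    return (best - mu) * norm.cdf(z) + sigma * norm.pdf(z)

def bayes_opt(f, n_init=3, n_iter=10, seed=0):
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, n_init)          # random initial design
    y = np.array([f(x) for x in X])
    cand = np.linspace(0, 1, 200)          # fixed candidate grid
    for _ in range(n_iter):
        mu, sigma = gp_posterior(X, y, cand)
        ei = expected_improvement(mu, sigma, y.min())
        x_next = cand[np.argmax(ei)]       # acquisition maximizer
        X = np.append(X, x_next)
        y = np.append(y, f(x_next))        # one (expensive) evaluation
    return X[np.argmin(y)], y.min()

# Example: minimize a toy 1-D "expensive" function on [0, 1]
x_best, y_best = bayes_opt(lambda x: (x - 0.3) ** 2 + 0.05 * np.sin(20 * x))
```

Note that only 13 evaluations of the objective are made in total; all other work happens on the cheap surrogate, which is exactly why the approach pays off when each evaluation is costly.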

Some key resources about Bayesian optimization include the following papers:

Shahriari et al., "Taking the Human Out of the Loop: A Review of Bayesian Optimization" (2015).

Jones et al., "Efficient Global Optimization of Expensive Black-Box Functions" (1998).

193 questions
2 votes, 1 answer

Gaussian process on hyperparameter tuning

I feel it is kind of circular to use GP for hyperparameter tuning, since GP has its own hyperparameters. Or is it the case that GP typically has fewer hyperparameters than the model we want to tune (say NNs), which mitigates the issue…
Sam
2 votes, 0 answers

What is the current highest dimension for efficient Bayesian Optimization

It is frequently stated that BO can only be efficient for under 10 dimensions, or thereabouts. By efficient, I mean that the optimum is reached in an acceptable time, say less than a day given all data. Please give the open source package and the…
1 vote, 1 answer

Bayesian optimization - One run with 100 iterations vs ten runs with ten iterations

I was wondering whether it makes more sense to run Bayesian Optimization, let's say, once with 100 iterations or ten times with ten iterations each. Which one should I favor? I would assume that the first is better since the probabilistic model of the…
So S
1 vote, 1 answer

Bayesian Optimization with transformed objective function

I have a sequential learning problem where I want to rank a group of jobs in each iteration. Unlike the conventional Bayesian approach, I am trying to find the ordering of jobs based on g(f(x)), where the GP directly models f(x)--so, maximizing the posterior…
Nazgol
0 votes, 0 answers

Applying EI to GP-based multi-fidelity modeling

I am working on optimizing a GP-based multi-fidelity surrogate model, and I found this publication very helpful: "A Tutorial on Bayesian Optimization" by Peter I. Frazier (July 10, 2018). I have some questions regarding the conclusions on…
Ann