
First, a disclaimer: I'm not a statistician, so I apologize for any obvious errors.

Let's say I have a dataset with x and y values.

Now, I have a model with 10 parameters/coefficients that gives a predicted y value for a given x. This is not an analytical function. I randomly sample combinations of these 10 parameters (from some prior distributions), compute the corresponding models, and take the one that minimizes the $\chi^2$ (or some other goodness-of-fit measure) as the best-fit model.
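For concreteness, here is a toy sketch of the procedure I mean. The model function here is a made-up stand-in (my actual model is not analytic), and all the names, ranges, and sample counts are placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the (non-analytic) 10-parameter model:
# a sum of five sinusoids with amplitudes a and phases b.
def model(x, theta):
    a, b = theta[:5], theta[5:]
    return sum(a[i] * np.sin((i + 1) * x + b[i]) for i in range(5))

# Synthetic data with known noise level
x = np.linspace(0.0, 2.0 * np.pi, 100)
theta_true = rng.uniform(-1.0, 1.0, 10)
sigma = 0.1
y = model(x, theta_true) + rng.normal(0.0, sigma, x.size)

def chi2(theta):
    r = (y - model(x, theta)) / sigma
    return np.sum(r ** 2)

# Random search over the prior: draw many parameter vectors
# and keep the one with the smallest chi^2.
samples = rng.uniform(-1.0, 1.0, size=(5000, 10))
chi2_vals = np.array([chi2(t) for t in samples])
best = samples[np.argmin(chi2_vals)]
```

This keeps only the single smallest $\chi^2$, which is exactly where my problem starts.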

The problem arises when I repeat this but start my sampling from different priors: I get a different set of parameters that fits the data as well as the first one. So the parameters are degenerate, and the parameter space presumably has multiple local minima.
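The kind of degeneracy I mean can be reproduced on a deliberately bimodal toy $\chi^2$ surface. A brute-force version of what I'd like is to keep every sample within some tolerance of the best fit and then group the acceptable samples into distinct modes (the surface, the tolerance, and the clustering radius below are all placeholders I made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy chi^2 surface in 2D with two equally good minima at
# theta = (+1, 0) and theta = (-1, 0): a stand-in for the
# degenerate 10-parameter problem.
def chi2(theta):
    d1 = np.sum((theta - np.array([1.0, 0.0])) ** 2)
    d2 = np.sum((theta + np.array([1.0, 0.0])) ** 2)
    return min(d1, d2) * 50.0

samples = rng.uniform(-2.0, 2.0, size=(20000, 2))
chi2_vals = np.array([chi2(t) for t in samples])

# Keep every sample whose chi^2 is within a tolerance of the
# best fit, not just the single minimum.
tol = 2.0
good = samples[chi2_vals <= chi2_vals.min() + tol]

# Greedy distance-based grouping: a point starts a new mode
# only if it is farther than `radius` from all existing modes.
def cluster(points, radius=0.5):
    centers = []
    for p in points:
        if all(np.linalg.norm(p - c) > radius for c in centers):
            centers.append(p)
    return centers

modes = cluster(good)  # should recover both minima here
```

This works on a 2D toy, but brute-force sampling plus ad-hoc clustering clearly won't scale to my real 10-dimensional problem, hence the question below.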

Is there a way to find all such combinations of parameters that produce models fitting the data well (up to some tolerance)? I have tried techniques like MultiNest, but the problem persists. Thanks for your time.

  • I think a good start would be this post: https://stats.stackexchange.com/q/383731/6193 , containing several answers with a lot of options and references. – frank Jul 14 '22 at 10:08
  • @frank Thanks. Approximate Bayesian Computation seems to crop up quite a lot in that. I'll read up on it. – Agnibha Banerjee Jul 14 '22 at 15:23
