I have a system whose performance depends on a rather large parameter set (200 parameters, let's say, each of which can take a very wide range of values). There are tests to evaluate the performance of a given parameter set, but each evaluation is computationally expensive and slow. Given that there may or may not be patterns relating how the parameters change to how the performance changes, is there a smart way to search this huge parameter space for values that maximize the performance of the system while minimizing the computational overhead?
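To make the setup concrete, here is a minimal sketch of the kind of interface I have in mind. The names `evaluate` and `ranges` are placeholders I made up, not my actual system, and the toy scoring function just stands in for the expensive test; plain random sampling within the user-given ranges is the naive baseline I would like to beat:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real system: 200 parameters, each with a
# user-specified (low, high) test range. The real evaluate() is expensive.
N_PARAMS = 200
ranges = np.array([(-10.0, 10.0)] * N_PARAMS)  # placeholder ranges

def evaluate(params: np.ndarray) -> float:
    """Expensive black-box performance test (toy stand-in here)."""
    return -np.sum((params - 1.0) ** 2)  # higher is better

# Naive baseline: uniform random search within the user-given ranges.
def random_search(n_trials: int):
    best_params, best_score = None, -np.inf
    for _ in range(n_trials):
        candidate = rng.uniform(ranges[:, 0], ranges[:, 1])
        score = evaluate(candidate)
        if score > best_score:
            best_params, best_score = candidate, score
    return best_params, best_score

params, score = random_search(100)
print(f"best score after 100 evaluations: {score:.2f}")
```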
A trend MAY be guessed for some of those parameters, but not with much confidence. The user can specify a test range for each parameter. Is there any approach that is definitely better than a brute-force exhaustive search? Perhaps something that looks for trends on the fly (see the sketch below) and avoids searching in unpromising directions? But it needs good statistical support before ignoring a branch.
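By "looks for trends on the fly" I mean roughly the following surrogate-model loop: spend part of the budget on expensive evaluations, fit a cheap model to the results, and let the model's predictions steer where the remaining expensive evaluations go. This is only a rough sketch of the idea; `evaluate`, `ranges`, and the choice of a random-forest surrogate are all my own assumptions, and I gather real tools would add an exploration term rather than greedily taking the surrogate's argmax:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

N_PARAMS = 200
ranges = np.array([(-10.0, 10.0)] * N_PARAMS)  # hypothetical user-given ranges

def evaluate(params: np.ndarray) -> float:
    """Expensive black-box test (toy stand-in here)."""
    return -np.sum((params - 1.0) ** 2)

def sample(n: int) -> np.ndarray:
    """Draw n uniform candidates from the user-given box of ranges."""
    return rng.uniform(ranges[:, 0], ranges[:, 1], size=(n, N_PARAMS))

# Seed with a few expensive evaluations, then let a cheap surrogate model
# decide where the remaining expensive budget is spent.
X = sample(20)
y = np.array([evaluate(x) for x in X])

surrogate = RandomForestRegressor(n_estimators=100, random_state=0)
for _ in range(30):                      # 30 more expensive evaluations
    surrogate.fit(X, y)
    candidates = sample(5000)            # cheap to score with the surrogate
    preds = surrogate.predict(candidates)
    best = candidates[np.argmax(preds)]  # most promising region so far
    X = np.vstack([X, best])
    y = np.append(y, evaluate(best))

print(f"best score after {len(y)} expensive evaluations: {y.max():.2f}")
```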
A subset of the parameters may be mutually dependent, but we have no information about which ones, so no assumptions can be made there.