
I have a general question: is random search a heuristic or a metaheuristic? As far as I understand, metaheuristics define a general principle for approximately solving an optimization problem that is applicable to a wide variety of problems. Thus, I would infer that random search is also a metaheuristic, as it can be applied to (almost?) every optimization problem. I would argue that, in a nutshell, it is not really different from evolutionary algorithms or particle swarm optimization: none of them make use of problem-specific characteristics the way a heuristic usually does.

Edit: By random search I am referring to a method that just randomly samples a fixed number of points in the solution space and, after evaluating all of them, returns the best solution found.
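For concreteness, the method I mean could be sketched as follows (a minimal illustration, not any particular library's implementation; `evaluate`, `sample`, and the toy objective are placeholders for whatever problem you have):

```python
import random

def random_search(evaluate, sample, n_iters=1000):
    """Sample n_iters random candidates and keep the best one.

    evaluate: maps a candidate solution to a cost (lower is better).
    sample:   draws one random candidate from the solution space.
    """
    best, best_cost = None, float("inf")
    for _ in range(n_iters):
        candidate = sample()
        cost = evaluate(candidate)
        if cost < best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

# Toy example: minimize f(x) = (x - 3)^2 over the interval [-10, 10]
best, cost = random_search(lambda x: (x - 3) ** 2,
                           lambda: random.uniform(-10, 10),
                           n_iters=10000)
```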

What is your take on this?

PeterBe
  • What do you mean by random search? Just choosing a random solution from a pre-defined neighborhood structure? – PeterD Jun 06 '23 at 15:46
  • @PeterD: Thanks Peter for the good remark. I edited my question to describe what I mean by random search – PeterBe Jun 06 '23 at 15:48
  • I would vote for metaheuristic. – prubin Jun 06 '23 at 15:51
  • I would also vote for metaheuristic, since this logic can be applied to a broad range of optimization problems. However, I would not agree with your statement that they are not really different from evolutionary algorithms and PSO. There is a reason why these algorithms give SOTA results for some problems, something that would hardly (or most often not) be achievable by just randomly searching the solution space. – PeterD Jun 06 '23 at 16:05
  • I personally find the term metaheuristic too ambiguous. I've encountered too many different definitions. So, I don't use it anymore – fontanf Jun 06 '23 at 20:38
  • @fontanf True, but I think etymology prevails: heuristics are specific to a given problem, metaheuristics are generic (hence "meta"). In which case I agree with prubin. – Kuifje Jun 06 '23 at 20:56
  • @Kuifje what prevails is to be understood – fontanf Jun 07 '23 at 09:40
  • I vote for calling it "throwing spaghetti at the wall and seeing what sticks". – Mark L. Stone Jun 08 '23 at 22:11

1 Answer


I think the answer depends on which school you belong to. I've heard many define metaheuristics as generic heuristics, in the sense that they can be applied to a broad range of optimization problems. In the way I was taught, however, metaheuristics are algorithms built on top of heuristics. These heuristics can be generic or problem-specific, typically parameterized and often randomized, but they all have in common that they take some input and produce a single solution. The job of the metaheuristic is then to iteratively select which heuristic to run (e.g., one that constructs a new solution, or one that combines/improves old ones) and which data to feed it (e.g., updated pseudo-costs, tabu lists or probability distributions), so as to strike a balance between climbing local peaks (local optimization) and not being caught in the neighborhood of one forever (diversification).
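That view can be sketched in a few lines (a hypothetical toy, not any named algorithm: the two "heuristics" here are a random constructor and a single-step perturbation, and the metaheuristic is only the outer loop that decides which one to call):

```python
import random

def construct(rng):
    # Constructive heuristic: produce a fresh random solution in [-10, 10].
    return rng.uniform(-10, 10)

def improve(x, rng, step=0.5):
    # Improvement heuristic: propose one small random perturbation.
    return x + rng.uniform(-step, step)

def metaheuristic(evaluate, n_iters=2000, p_restart=0.05, seed=0):
    """Outer loop choosing between diversification (construct)
    and local optimization (improve)."""
    rng = random.Random(seed)
    current = construct(rng)
    best, best_cost = current, evaluate(current)
    for _ in range(n_iters):
        if rng.random() < p_restart:              # diversify: restart
            current = construct(rng)
        else:                                     # intensify: local step
            candidate = improve(current, rng)
            if evaluate(candidate) < evaluate(current):
                current = candidate               # keep only if better
        cost = evaluate(current)
        if cost < best_cost:
            best, best_cost = current, cost
    return best, best_cost

# Toy objective: f(x) = (x - 3)^2, minimized at x = 3
best, cost = metaheuristic(lambda x: (x - 3) ** 2)
```

Setting `p_restart=1.0` would reduce this loop to pure random search; `p_restart=0.0` would make it a pure hill climber.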

From this perspective, your description of random search is perhaps the simplest possible metaheuristic in existence, as it uses only one heuristic (constructive random selection) and reruns it repeatedly, with only the internal state of the random number generator changing from run to run. In some sense it represents an extreme in the space of metaheuristics, with 100% diversification and 0% local optimization. Simplistic as it is, however, it is guaranteed to find the optimal solution in finite time if the solution space is finite, which is a theoretical property often sought in metaheuristics.

  • Thanks Henrik for your answer (I upvoted it). The other question I have is: what is the difference between a local optimization method (like gradient descent or Newton's method) and a (meta)heuristic? Actually, a heuristic also finds a local optimum. Is it the randomness that keeps a heuristic from being labeled a local optimization method? – PeterBe Jun 07 '23 at 09:21
  • I think you will find it helpful to read the answers to https://or.stackexchange.com/questions/709/are-metaheuristics-ever-practical-for-continuous-optimization – Henrik Alsing Friberg Jun 08 '23 at 07:09
  • @Henrik: Thanks Henrik for your answer. Any comment on my other question about the difference between a local optimization method (like gradient descent or Newton's method) and a (meta)heuristic? The link does not answer this question as far as I can see. – PeterBe Jun 28 '23 at 08:07
  • Local optimization refers to any type of optimization under relaxed optimality conditions. That is, it gives you feasible points that might not be a true optimum. With gradient descent and Newton's method you only satisfy the first-order optimality condition (gradient equal to zero), and with RINS you are only optimal with respect to a given neighborhood. – Henrik Alsing Friberg Jun 29 '23 at 09:29
  • Thanks Henrik for your answer. If I take your definition "Local optimization refers to any type of optimization under relaxed optimality conditions. That is, they will give you feasible points that might not be true optimum.", then metaheuristics like evolutionary algorithms would also be considered local optimization. But I don't think that is appropriate. They are rather regarded as methods for (heuristic) global optimization, as far as I know. – PeterBe Jun 29 '23 at 12:14
  • I believe the solution from a local optimization method must satisfy some set of relaxed optimality conditions. As example, an improvement heuristic typically defines some set of moves by which it tries to improve the solution and, when it halts, the solution is optimal with respect to the neighborhood defined by those moves (that is, it gives a local optimality guarantee). In contrast, metaheuristics use diversification strategies when there is no more progress to be made locally. – Henrik Alsing Friberg Jun 29 '23 at 13:10
  • Thanks a lot for your answer. I really appreciate it. But are local optimization methods like the Newton method then regarded as exact optimization methods or heuristics? I would assume it is still an exact method, and that exact methods can be subdivided into global and local optimization. But I don't know if this is in line with the mathematical definition. Is it possible that (meta)heuristics lead to better results than local optimization methods, since (meta)heuristics are more or less global optimization methods (although they are not exact methods)? – PeterBe Jul 04 '23 at 13:56
  • Exact algorithms solve to global optimality as opposed to heuristic algorithms that have no such guarantee. You can have a newton-type method that works as an exact algorithm for some type of problems and as a heuristic algorithm for others. Even in the heuristic case, however, it will most likely provide local optimality guarantees and thus be categorized under the subset of heuristics known as local optimization methods. – Henrik Alsing Friberg Jul 05 '23 at 12:48
  • Thanks Henrik for your answer. You wrote "Exact algorithms solve to global optimality". According to this definition, the Newton method would not be regarded as an exact method, as it cannot guarantee finding the global optimum (only a local one) if the problem is non-convex. – PeterBe Jul 06 '23 at 10:07
  • Exactly. Newton-type methods are not exact algorithms for non-convex problems, but, as I said, they can be exact methods for other types of problems (e.g., strongly convex) depending on implementation details (see, e.g., this list of failure scenarios that need to be addressed). – Henrik Alsing Friberg Jul 06 '23 at 10:47
  • @Henrik: Thanks a lot Henrik for your answers and your effort. I really appreciate that you share your knowledge with me. So are Newton-type methods a separate class of optimization algorithms, called local optimization algorithms? As far as I can see, there are 4 types of optimization algorithms: 1) exact methods, 2) problem-specific heuristics, 3) metaheuristics, 4) local optimization methods. How do you see this? – PeterBe Jul 12 '23 at 08:56
  • Thanks a lot for your answers. Any comments to my last comment? I'll highly appreciate every further comment from you. – PeterBe Jul 22 '23 at 14:37
  • I like your categorization, although I think it is more common to denote (2) as constructive heuristics and (4) as local search heuristics. See, e.g., https://en.wikipedia.org/wiki/Constructive_heuristic. The best constructive heuristics, however, are usually those that are more problem-specific ;-) – Henrik Alsing Friberg Jul 24 '23 at 05:35