
I was thinking about the Traveling Salesman Problem the other day - do discrete combinatorial optimization problems like this have a "loss function"?

It seems that there is some "vague" function which takes different orderings of cities as input and returns the total distance traveled if the cities are visited in that order.
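To make that concrete, here is a rough sketch of the kind of function I have in mind, using a made-up 4-city distance matrix:

```python
import itertools

# Made-up symmetric distance matrix for 4 cities (purely illustrative).
dist = [
    [0, 2, 9, 10],
    [2, 0, 6, 4],
    [9, 6, 0, 3],
    [10, 4, 3, 0],
]

def tour_length(order):
    """Total distance of visiting the cities in `order` and returning to the start."""
    return sum(dist[order[i]][order[(i + 1) % len(order)]] for i in range(len(order)))

# Brute force over all orderings is only feasible for tiny instances.
best = min(itertools.permutations(range(4)), key=tour_length)
print(best, tour_length(best))
```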

Would this (along with problems such as "scheduling") be considered "gradient-free optimization"?


1 Answer


In no particular order ...

  • I wouldn't call the objective function of the TSP "vague". It is quite explicit.
  • The term "gradient-free" applies to algorithms, not to problems. You might find yourself choosing between gradient-based and "gradient-free" algorithms for the same problem. The confusion may arise from the fact that you have to use a "gradient-free" algorithm when the problem's objective function does not have a gradient.
  • The simplex method is gradient-based, so any TSP algorithm that involves solving an LP will not be "gradient-free".
  • The Nelder-Mead algorithm for nonlinear optimization (confusingly known as the "simplex algorithm", but not that "simplex algorithm") is an example of a gradient-free algorithm.
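For illustration (not part of the original answer), here is a minimal sketch of calling Nelder-Mead through SciPy; the method only ever evaluates the objective and never asks for a derivative, which is what "gradient-free" means in practice:

```python
from scipy.optimize import minimize

def rosenbrock(v):
    # A smooth test function; note that we never supply its gradient.
    x, y = v
    return (1 - x) ** 2 + 100 * (y - x ** 2) ** 2

result = minimize(rosenbrock, x0=[-1.0, 1.0], method="Nelder-Mead")
print(result.x)  # should land near the true minimizer (1, 1)
```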
  • thank you for your answer! Just to clarify: the objective function for the TSP has no gradient? Thank you so much! – stats_noob Jan 23 '22 at 16:25
  • I would have thought that for all discrete combinatorial optimization problems, the objective function will not have a gradient? – stats_noob Jan 23 '22 at 16:32
  • The TSP objective function depends on how you define your variables. Most commonly, you will use binary variables $x_{ij}$, taking value 1 if $i\rightarrow j$ is part of the tour. The objective function is linear in the $x_{ij}$ variables, and linear functions have (constant) gradients; see the worked equation after these comments. – prubin Jan 23 '22 at 17:00
  • Thank you for your reply! "Most commonly, you will use binary variables xij" - in this case, the objective function will not have a gradient, correct? – stats_noob Jan 23 '22 at 17:12
  • No, the objective function will have a gradient. The issue with discrete variables is not whether the objective function has a gradient but to what extent that gradient is meaningful/useful, since "small" changes to the variables do not make sense. – prubin Jan 23 '22 at 19:42
  • @prubin: thank you for your reply! Sorry to ask so many questions - if you have time, could you please help me understand this? In the case of the Travelling Salesman Problem, how could we take the "gradient of the objective function"? This seems impossible, no? Could you please expand on this? I cannot thank you enough for your help! Thanks! – stats_noob Jan 23 '22 at 19:45
  • I'm moving this to a chat session. You should get the invitation from Stack Exchange. – prubin Jan 23 '22 at 20:09
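To make the point in these comments concrete, here is the binary-variable formulation prubin describes, written out (my notation): with $x_{ij} = 1$ if the tour goes directly from city $i$ to city $j$ and $d_{ij}$ the corresponding distance,

$$f(x) = \sum_{i \ne j} d_{ij}\, x_{ij}, \qquad \frac{\partial f}{\partial x_{ij}} = d_{ij}.$$

The gradient exists and is simply the constant vector of distances; the catch, as noted above, is that feasible solutions require $x_{ij} \in \{0, 1\}$ (plus the tour constraints), so a small step along the gradient does not lead to another feasible tour.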