TLDR: If George Costanza were asked to do Robust Optimization, he would instead do Best Case Optimization, which is (sort of) the opposite of Robust Optimization.
Is there a literature or problem nomenclature associated with the concept of optimization with respect to the best case? That is sort of the opposite of Robust Optimization, which is optimizing with respect to the worst case.
Robust optimization optimizes the worst-case outcome over some defined universe of possibilities. For instance, the increasingly popular "Wasserstein distributionally robust optimization" finds the optimum for the worst-case distribution within a specified Wasserstein distance.
Optimizing for the best case would instead find the optimum for the best-case distribution within a specified Wasserstein distance (or some other ambiguity set).
Roughly speaking, optimizing for the best case would let the optimizer choose the most favorable among a set of possible constraints, basically an "or"; whereas robust optimization essentially amounts to an "and".
Another analogy would be to Stochastic Programming, except that in best case optimization the optimizer gets to choose the most favorable scenario to incorporate as a constraint.
For example, let's say that (at least) one of the $n$ matrix constraints $A_i x = b_i$, $i = 1, \dots, n$, is satisfied, but we don't know which one. Best case optimization would solve
$$\min_{i,\,x} \; f(x) \quad \text{s.t.} \quad A_i x = b_i.$$ So mini-min, not mini-max.
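To make the mini-min vs. mini-max contrast concrete, here is a small sketch (made-up toy data, not from any real problem). It restricts attention to LPs with a single equality row and $x \ge 0$, where the optimum lies at a vertex $x = (b_i / a_{ij})\, e_j$, so each scenario can be solved by enumeration; the best-case optimizer takes the min over scenarios, while a robust optimizer would take the max:

```python
# Best-case ("optimistic") optimization sketch: minimize c @ x subject to
# ONE of several candidate equality constraints a_i @ x = b_i, x >= 0,
# where the optimizer also chooses the most favorable scenario i.
# Toy setting: with a single equality row and x >= 0 (b >= 0), the LP
# optimum lies at a vertex x = (b / a_j) * e_j for a_j > 0, so we enumerate.

def solve_single_row_lp(c, a, b):
    """min c @ x  s.t.  a @ x = b, x >= 0, for one constraint row."""
    best = None
    for j, (cj, aj) in enumerate(zip(c, a)):
        if aj > 0:  # candidate vertex puts all weight on coordinate j
            val = cj * b / aj
            if best is None or val < best[0]:
                best = (val, j)
    return best  # (objective value, active coordinate), or None if infeasible

def scenario_optima(c, scenarios):
    """Per-scenario optima: [(value, scenario index), ...] for feasible ones."""
    out = []
    for i, (a, b) in enumerate(scenarios):
        res = solve_single_row_lp(c, a, b)
        if res is not None:
            out.append((res[0], i))
    return out

c = [1.0, 2.0]
scenarios = [([1.0, 1.0], 4.0),   # scenario 0: x1 + x2 = 4 -> optimum 4.0
             ([1.0, 1.0], 2.0)]   # scenario 1: x1 + x2 = 2 -> optimum 2.0

optima = scenario_optima(c, scenarios)
best_case = min(optima)   # mini-min: pick the most favorable scenario
worst_case = max(optima)  # mini-max flavor: guard against the worst scenario
print(best_case, worst_case)  # (2.0, 1) (4.0, 0)
```

The best-case optimizer exploits the "or" over scenarios (objective 2.0 via scenario 1), while the robust answer hedges against the worst one (objective 4.0 via scenario 0).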