
Given a quadratic program $$ f^* \equiv \min_{x \in \mathbb{R}^n} \; x^\top Q x + b^\top x \\ \text{s.t. } A^\top x = d \\ x \geq 0 $$ I would like to analyze the sensitivity of the solution $x^*$ to perturbations in $Q$ and $b$. Can anything be said about how $x^*$ varies when $Q$ and $b$ are perturbed, for instance a bound on the change in $x^*$ in terms of a matrix norm of the perturbation? One special case that could be interesting is when the equality constraint is replaced by $1^\top x = 1$, i.e., the QP is solved on the simplex.

One special case analysis I found is for the linear least squares problem, see G. W. Stewart, "On the Perturbation of Pseudo-Inverses, Projections and Linear Least Squares Problems", SIAM Review, Vol. 19, No. 4 (Oct. 1977), pp. 634-662. However, this does not generalize to the quadratic program in a straightforward manner.
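For concreteness, here is a small numerical sketch of the kind of sensitivity I have in mind, using arbitrary data for the simplex case and `scipy.optimize.minimize` (SLSQP) as one convenient solver choice:

```python
# Sketch: solve the simplex-constrained QP for (Q, b) and for a perturbed
# (Q + dQ, b + db), then compare the minimizers. Data and sizes are arbitrary.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n = 5

# Random positive-definite Q and random b (illustrative data only).
M = rng.standard_normal((n, n))
Q = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)

def solve_simplex_qp(Q, b):
    """Minimize x'Qx + b'x subject to 1'x = 1, x >= 0 (via SLSQP)."""
    obj = lambda x: x @ Q @ x + b @ x
    grad = lambda x: 2 * Q @ x + b
    cons = [{"type": "eq", "fun": lambda x: np.sum(x) - 1.0,
             "jac": lambda x: np.ones_like(x)}]
    x0 = np.full(len(b), 1.0 / len(b))
    res = minimize(obj, x0, jac=grad, bounds=[(0.0, None)] * len(b),
                   constraints=cons, method="SLSQP")
    return res.x

# Small symmetric perturbation of Q and a small perturbation of b.
dQ = 1e-3 * rng.standard_normal((n, n))
dQ = 0.5 * (dQ + dQ.T)
db = 1e-3 * rng.standard_normal(n)

x_star = solve_simplex_qp(Q, b)
x_pert = solve_simplex_qp(Q + dQ, b + db)

# Empirical sensitivity: ||x*(Q+dQ, b+db) - x*(Q, b)|| versus ||dQ||, ||db||.
print("||dx|| =", np.linalg.norm(x_pert - x_star))
print("||dQ|| =", np.linalg.norm(dQ, 2), " ||db|| =", np.linalg.norm(db))
```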

ntrstd11

1 Answer


The sensitivity analysis of optimization problems is called parametric programming or sometimes "post-optimal analysis".

The short version is that you describe the variability of your optimization problem as a function of certain parameters, typically denoted $\theta$. Then you solve the optimization problem as a function of $\theta$, i.e. you obtain $x^*(\theta)$. For QPs, the best way to do this is by solving the parametric version of the KKT conditions.
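To make the KKT route concrete, here is a rough sketch (my own construction, assuming the parameter enters the linear term as $b(\theta) = b_0 + B\theta$ and that a fixed active set of the bound constraints is given): on a fixed active set, the KKT conditions reduce to a linear system whose right-hand side is affine in $\theta$, so $x^*(\theta)$ is affine in $\theta$.

```python
# Sketch: for the QP  min x'Qx + b(theta)'x  s.t.  A'x = d, x >= 0,
# with b(theta) = b0 + B @ theta, fix an active set of the bounds x_i >= 0.
# On that active set the KKT system is linear, so x*(theta) = F @ theta + g,
# valid as long as this active set remains optimal.
import numpy as np

def affine_solution_for_active_set(Q, b0, B, A, d, active):
    """Return (F, g) with x*(theta) = F @ theta + g for a fixed active set.

    Unknowns: x, equality multipliers mu, and multipliers nu_i for i in `active`.
    Assumes the KKT matrix is nonsingular for this active set.
    """
    n, m = Q.shape[0], A.shape[1]
    act = np.asarray(active, dtype=int)
    k = len(act)
    E = np.zeros((n, k))
    E[act, np.arange(k)] = 1.0               # selects the active bounds

    # KKT system for this active set:
    #   [ 2Q   -A   -E ] [x ]   [ -b(theta) ]
    #   [ A'    0    0 ] [mu] = [     d     ]
    #   [ E'    0    0 ] [nu]   [     0     ]
    K = np.block([
        [2 * Q, -A,                -E],
        [A.T,    np.zeros((m, m)), np.zeros((m, k))],
        [E.T,    np.zeros((k, m)), np.zeros((k, k))],
    ])

    # Right-hand side is affine in theta: rhs(theta) = r0 + R @ theta.
    p = B.shape[1]
    r0 = np.concatenate([-b0, d, np.zeros(k)])
    R = np.vstack([-B, np.zeros((m + k, p))])

    sol0 = np.linalg.solve(K, r0)            # solution at theta = 0
    solR = np.linalg.solve(K, R)             # sensitivity w.r.t. theta
    g, F = sol0[:n], solR[:n, :]             # keep only the x-block
    return F, g
```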

The solution of a parametric optimization problem is a collection of so-called "critical regions", i.e. regions of the parameter space in which a certain active set remains optimal despite a variation of the parameters. For each critical region you can calculate an analytical expression for $x^*(\theta)$ (and also for the Lagrange multipliers).
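Schematically, in the standard multiparametric-QP notation (my shorthand, assuming the parameter enters the linear term): within the $i$-th critical region the optimizer and the multipliers are affine in $\theta$, and the region itself is the polyhedron on which that active set stays primal and dual feasible, $$ x^*(\theta) = F_i\,\theta + g_i, \qquad \lambda^*(\theta) = H_i\,\theta + h_i, \qquad \mathrm{CR}_i = \{\theta : x^*(\theta) \ge 0,\ \lambda^*(\theta) \ge 0\}. $$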

There are some references around this topic that may interest you:

Apologies for self-referencing, I genuinely think these are good references though.

Richard