I need to solve a "simple" (= small) optimization problem over and over, with only minor changes to the objective function between solves, so it's important to keep the time per solve as low as possible. The problem is basically a QP with $Q := \operatorname{diag}(\alpha, \dots, \alpha)$ positive definite and linear constraints, i.e. of the following form:
$$ \begin{align} \min&\quad c'x + \alpha\cdot x'x \\ \text{s.t.}&\quad Ax \geq 0 \\&\quad x \geq 0 \end{align} $$
with decision variable $x \in \mathbb{R}^n$, as well as fixed $c \in \mathbb{R}^n$ and $A \in \mathbb{R}^{m\times n}$ and $\alpha \in \mathbb{R}_+$.
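For reference, a single instance can be handed to a general-purpose solver; the sketch below does this with `scipy.optimize.minimize` (SLSQP) on a tiny made-up instance (all numbers are hypothetical, just to make the snippet runnable):

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical small instance (n = m = 2); all numbers are made up.
A = np.array([[1.0, -1.0],
              [-1.0, 2.0]])
c = np.array([-1.0, -2.0])
alpha = 0.5

res = minimize(
    lambda x: c @ x + alpha * (x @ x),          # c'x + alpha * x'x
    x0=np.ones(A.shape[1]),
    jac=lambda x: c + 2 * alpha * x,            # gradient of the objective
    constraints=[{"type": "ineq", "fun": lambda x: A @ x}],  # Ax >= 0
    bounds=[(0, None)] * A.shape[1],            # x >= 0
    method="SLSQP",
)
x_star = res.x
```

This is the baseline to beat: each re-solve with new $c$ and $\alpha$ pays the full iterative-solver cost, which is exactly the overhead the question is trying to avoid.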
Is there any way to pre-calculate a "somewhat" analytical solution that is (as) fast (as possible) to re-solve with changing values for $c$ and $\alpha$?
Remarks: $n$ and $m$ are fairly small, and memory as well as pre-calculation time are (mostly) not relevant.
To clarify what I mean by "somewhat": if the term $\alpha\cdot x'x$ did not exist, one possible way would be to precompute the polytope defined by the constraints and extract its vertices $y_i$. The optimal solution $x^*$ must lie at one of those vertices. Therefore we can calculate $c'y_i$ and select $x^* := y_j$ for one $j$ that satisfies $c'y_j \leq c'y_i\ \forall i$. This essentially boils down to one matrix-vector multiplication and one argmin, which is really fast.
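The linear case described above can be sketched in a few lines; the vertex list `Y` stands in for the precomputed polytope vertices (the numbers here are purely illustrative):

```python
import numpy as np

# Hypothetical precomputed vertices y_i of the feasible polytope,
# stacked as rows of Y (illustrative numbers only).
Y = np.array([[0.0, 0.0],
              [1.0, 1.0],
              [0.0, 2.0],
              [2.0, 3.0]])

def lp_resolve(c):
    """Re-solve min c'x over the polytope: one mat-vec and one argmin."""
    return Y[np.argmin(Y @ c)]

lp_resolve(np.array([-1.0, -2.0]))  # → array([2., 3.])
```

All the work is in building `Y` once; each re-solve is a single pass over the vertex table.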
Initial thought: since the set of feasible solutions is bounded and $Q$ is positive definite, a minimum exists and can be found by examining the KKT points (any KKT point is also a global minimum, since the objective function is convex). Finding those KKT points means solving a system of equations, but $c$ and $\alpha$ appear as parameters in that system, which prevents truly pre-calculating everything...
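One way this KKT idea can still be partially pre-calculated: stack $Ax \geq 0$ and $x \geq 0$ into $Gx \geq 0$, enumerate candidate active sets $S$ offline, and precompute for each one the orthogonal projector $P_S$ onto the null space of $G_S$. Because $Q = 2\alpha I$, stationarity on the subspace $\{G_S x = 0\}$ gives the closed-form candidate $x_S = -P_S\, c / (2\alpha)$, so a re-solve with new $c$ and $\alpha$ only evaluates these candidates and keeps the best feasible one. A sketch of this idea (instance data is hypothetical; the exponential enumeration is only viable because $n$ and $m$ are small):

```python
import numpy as np
from itertools import combinations

# Hypothetical small instance; all numbers are made up.
A = np.array([[1.0, -1.0],
              [-1.0, 2.0]])
n = A.shape[1]
G = np.vstack([A, np.eye(n)])   # stack Ax >= 0 and x >= 0 as Gx >= 0

# Offline: for every candidate active set S, precompute the orthogonal
# projector P_S onto the null space of G_S.  Stationarity of the
# equality-constrained subproblem then gives the closed form
#   x_S = -P_S c / (2 * alpha),
# so the projectors are all that needs to be stored.
projectors = [np.eye(n)]                         # S = {} (unconstrained)
for k in range(1, G.shape[0] + 1):
    for S in combinations(range(G.shape[0]), k):
        GS = G[list(S)]
        projectors.append(np.eye(n) - np.linalg.pinv(GS) @ GS)

def resolve(c, alpha, tol=1e-9):
    """Online: evaluate every candidate and keep the best feasible one."""
    best_x, best_val = None, np.inf
    for P in projectors:
        x = -P @ c / (2.0 * alpha)
        if np.all(G @ x >= -tol):                # primal feasibility
            val = c @ x + alpha * (x @ x)
            if val < best_val:
                best_x, best_val = x, val
    return best_x

x_star = resolve(np.array([-1.0, -2.0]), 0.5)
```

The optimum's true active set $S^*$ is among the enumerated candidates, and by strict convexity $x_{S^*}$ is exactly the global minimizer, so taking the best feasible candidate recovers it; the per-solve cost is a handful of small mat-vecs rather than a full QP solve.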