How can one efficiently sample $x \in \mathbb{R}^N$ from a multivariate normal distribution, $p(x) \propto \exp(- \frac{1}{2}x^T \Sigma^{-1} x)$, subject to the normalization constraint $x^T x = 1$? In my application, $N$ would typically be of order 10 or 20. If the precision matrix $\Sigma^{-1}$ (equivalently, the covariance $\Sigma$) were the identity, then the target distribution would be uniform on the $(N-1)$-dimensional sphere, and there are straightforward algorithms for that case. I get stuck when $\Sigma$ is not the identity.
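For reference, the identity special case can be handled by normalizing an isotropic Gaussian draw; a minimal NumPy sketch (the function name is my own):

```python
import numpy as np

def sample_uniform_sphere(N, rng=None):
    """Draw a point uniformly from the unit sphere in R^N.

    An isotropic Gaussian is rotationally invariant, so its radial
    projection onto the sphere is uniformly distributed.
    """
    rng = rng or np.random.default_rng()
    x = rng.standard_normal(N)
    return x / np.linalg.norm(x)
```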
One can reframe the problem by working in the eigenbasis of $\Sigma^{-1}$, so without loss of generality we may assume that $\Sigma^{-1}$ is diagonal. Then $x^T \Sigma^{-1} x = \sum_i \lambda_i x_i^2$, where $\lambda_i$, $i = 1, \dots, N$, denote the (real, positive) eigenvalues of $\Sigma^{-1}$. Substituting $q_i = x_i^2$, the task can alternatively be written as sampling nonnegative real numbers $\{q_i\}_{i=1,\dots,N}$ with density proportional to $\exp(-\sum_i \lambda_i q_i / 2)$ subject to the constraint ("given that") $\sum_i q_i = 1$. Note, however, that the change of variables $x \rightarrow q$ introduces a Jacobian factor in the integration measure $dq$ that makes it difficult to marginalize over some of the $q_i$ variables.
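To spell out that Jacobian factor: with $q_i = x_i^2$ one has $dx_i \propto q_i^{-1/2}\,dq_i$ (the two signs of $x_i$ contribute only a constant), so in the $q$ variables the constrained target becomes
$$
p(q) \;\propto\; \exp\!\Big(-\tfrac{1}{2}\sum_i \lambda_i q_i\Big)\,\prod_i q_i^{-1/2},
\qquad q_i \ge 0, \quad \sum_i q_i = 1,
$$
i.e. a Dirichlet$(\tfrac12,\dots,\tfrac12)$-type density on the simplex with an exponential tilt; the $q_i^{-1/2}$ factors are what obstruct marginalizing over individual $q_i$.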
This question has a similar flavor, but is not quite the same. Here, my constraint on $x$ is quadratic.
@whuber made a very interesting connection with a "sum of scaled $\Gamma(1/2)$ variables", with approximation methods discussed here: Generic sum of Gamma random variables
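As I understand the connection: in the eigenbasis, each unconstrained $q_i = x_i^2$ with $x_i \sim \mathcal{N}(0, 1/\lambda_i)$ is a scaled $\chi^2_1$, i.e. $\Gamma(\tfrac12, \mathrm{scale} = 2/\lambda_i)$, and the constraint conditions on their sum. A quick empirical check of that unconstrained marginal (variable names are just for illustration):

```python
import numpy as np
from scipy import stats

# If x ~ N(0, 1/lam), then q = x^2 should follow Gamma(shape=1/2, scale=2/lam),
# i.e. a scaled Gamma(1/2) variable.
lam = 3.0
rng = np.random.default_rng(0)
q = rng.normal(scale=1.0 / np.sqrt(lam), size=100_000) ** 2
print(stats.kstest(q, stats.gamma(a=0.5, scale=2.0 / lam).cdf))
```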
The constraint is $x^T x = 1$, and cannot be rescaled. For my application, I ended up doing some form of MCMC sampling of $x$, where some components of $x$ are exponentially more represented than others, and this works pretty well. Thanks again! – Kipton Barros Jul 04 '22 at 16:36
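For completeness, here is a minimal sketch (not necessarily the exact scheme used in the comment above) of one way to do such MCMC: random-walk Metropolis on the unit sphere, proposing $y \propto x + \sigma z$ with $z \sim \mathcal{N}(0, I)$ and renormalizing. Because the projected isotropic-Gaussian proposal density depends only on the angle between $x$ and $y$, the proposal is symmetric and the plain Metropolis acceptance ratio applies. The function name and step size $\sigma$ are illustrative:

```python
import numpy as np

def sample_sphere_mcmc(Sigma_inv, n_samples, sigma=0.2, seed=0):
    """Random-walk Metropolis for p(x) proportional to
    exp(-0.5 * x^T Sigma_inv x) restricted to the unit sphere.

    Proposal: y = (x + sigma * z) / ||x + sigma * z||, z ~ N(0, I).
    The proposal is symmetric on the sphere, so the acceptance
    probability reduces to the ratio of target densities.
    """
    rng = np.random.default_rng(seed)
    N = Sigma_inv.shape[0]

    def log_target(v):
        return -0.5 * v @ Sigma_inv @ v

    # Initialize from a uniform point on the sphere.
    x = rng.standard_normal(N)
    x /= np.linalg.norm(x)

    samples = np.empty((n_samples, N))
    for t in range(n_samples):
        y = x + sigma * rng.standard_normal(N)
        y /= np.linalg.norm(y)
        if np.log(rng.random()) < log_target(y) - log_target(x):
            x = y
        samples[t] = x
    return samples

# Example usage (random positive-definite precision matrix):
# A = np.random.default_rng(1).standard_normal((10, 10))
# Sigma_inv = A @ A.T + np.eye(10)
# xs = sample_sphere_mcmc(Sigma_inv, 50_000)
```

The step size $\sigma$ would need tuning so that the acceptance rate stays in a reasonable range (say 20–50%), particularly when the eigenvalues of $\Sigma^{-1}$ are widely spread.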