Suppose that we have a model with many parameters, which we'll partition into two subvectors $\theta$ and $\lambda$. Here $\lambda$ collects the parameters we really care about, and $\theta$ collects the nuisance parameters. In particular, we would really like to look at quantities like $P(x|\lambda)$, $P(x)$ and $P(\lambda|x)$, meaning we marginalize over $\theta$. However, performing the integral over all $\theta$ may be hard.
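As a concrete (hypothetical) illustration of the marginalization, here is a minimal sketch with a discrete model: the joint $P(x,\theta|\lambda)$ is stored as a made-up NumPy table with axes $(\lambda, \theta, x)$, and marginalizing out $\theta$ is just a sum along the $\theta$ axis.

```python
import numpy as np

# Hypothetical discrete model: a joint table P(x, theta | lambda),
# with axes (lambda, theta, x). The values are random, purely for illustration.
rng = np.random.default_rng(0)
joint = rng.random((3, 4, 5))
joint /= joint.sum(axis=(1, 2), keepdims=True)  # each P(., . | lambda) sums to 1

# Marginalizing over the nuisance parameter theta:
#   P(x | lambda) = sum_theta P(x, theta | lambda)
p_x_given_lam = joint.sum(axis=1)

# A true likelihood: for every lambda it sums to 1 over x.
assert np.allclose(p_x_given_lam.sum(axis=1), 1.0)
```

In a continuous model the sum along `axis=1` becomes an integral over $\theta$, which is exactly the step that may be intractable.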
However, for each $\lambda$, we may not even care about every possible value of $\theta$; we may only care about the best possible value of $\theta$ for that $\lambda$. This suggests we look at something like the following "partial maximum likelihood":
$$ Q(x|\lambda) = \max_\theta P(x|\theta,\lambda) P(\theta|\lambda) = \max_\theta P(x, \theta|\lambda) $$
This is an interesting function in its own right. If we treat it as a likelihood function, we can easily derive the related quantities $Q(x) = \sum_\lambda Q(x|\lambda) P(\lambda)$ and $Q(\lambda|x) = Q(x|\lambda) P(\lambda)/Q(x)$. It isn't a true likelihood, since the sum of $Q(x|\lambda)$ over all possible $x$ need not equal $1$, though it may be very close. We could try to normalize, or simply not care: even without normalization, $Q(\lambda|x)$ is a true probability distribution, as the normalizing terms cancel.
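The two claims above (that $Q(x|\lambda)$ falls short of summing to $1$, yet $Q(\lambda|x)$ is a genuine distribution) can be checked numerically. This is a sketch on the same kind of hypothetical discrete table as before, with a uniform prior $P(\lambda)$ assumed for illustration; note that $\max_\theta \le \sum_\theta$ termwise, which is why $\sum_x Q(x|\lambda) \le 1$.

```python
import numpy as np

# Same hypothetical setup: joint table P(x, theta | lambda), axes (lambda, theta, x).
rng = np.random.default_rng(0)
joint = rng.random((3, 4, 5))
joint /= joint.sum(axis=(1, 2), keepdims=True)

p_lam = np.full(3, 1.0 / 3)  # prior P(lambda), assumed uniform here

# "Partial maximum likelihood": Q(x | lambda) = max_theta P(x, theta | lambda)
q_x_given_lam = joint.max(axis=1)

# Not a true likelihood: since max_theta <= sum_theta, the sum over x is at most 1.
assert np.all(q_x_given_lam.sum(axis=1) <= 1.0)

# But Q(lambda | x) = Q(x|lambda) P(lambda) / Q(x) is a genuine distribution,
# because the missing normalizer appears in both numerator and denominator.
q_x = (q_x_given_lam * p_lam[:, None]).sum(axis=0)   # Q(x) = sum_lambda Q(x|lambda) P(lambda)
q_lam_given_x = q_x_given_lam * p_lam[:, None] / q_x
assert np.allclose(q_lam_given_x.sum(axis=0), 1.0)   # sums to 1 over lambda for every x
```

The final assertion is the "normalizing terms cancel" observation: any constant multiplying $Q(x|\lambda)$ uniformly would divide out of $Q(\lambda|x)$.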
Question: what is this called? It doesn't seem to be quite the partial likelihood. Maximizing over nuisance parameters is reminiscent of the frequentist profile likelihood, though here the prior $P(\theta|\lambda)$ is included in the maximization. Perhaps one could call it a quasi-likelihood? "Partial maximum likelihood" would be a natural name, except I am not quite sure whether it coincides with that existing term. Or can we perhaps treat the posterior as a true posterior on some other related quantity?