I am having trouble "mapping" the variables in Bayes' theorem onto the case of regression. As notation, say $$ P(\theta|D) = \frac{P(D|\theta) P(\theta)}{ P(D) } $$ where I have come to think of $\theta$ as the parameters of a compact model.
Notation for regression: $$ y = f(x,\theta) $$ or $$ y = f_{\theta}(x) $$
In regression we want to estimate both $y$ (at prediction time, after training) and $\theta$ (during training/fitting). Is the posterior one of these:
$p(\theta|y,x)$
$p(y|\theta,x)$
$p(y,\theta | x)$
$p(\theta|x)$
Or does the idea of a posterior simply not apply in typical regression?
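For concreteness, here is a toy sketch of what I understand the first candidate, $p(\theta|y,x)$, to mean. It assumes a hypothetical 1-D linear model $y = \theta x + \varepsilon$ with a Gaussian prior on $\theta$ and Gaussian noise, so the posterior is available in closed form by conjugacy (the specific prior/noise values are made up for illustration):

```python
import numpy as np

# Toy 1-D Bayesian linear regression: y = theta * x + noise.
#   Prior:      theta ~ N(0, tau^2)
#   Likelihood: y_i | theta, x_i ~ N(theta * x_i, sigma^2)
# By conjugacy the posterior p(theta | y, x) is also Gaussian.

def posterior(x, y, sigma=0.5, tau=10.0):
    """Return mean and variance of p(theta | y, x)."""
    # Posterior precision = prior precision + data precision.
    precision = 1.0 / tau**2 + np.dot(x, x) / sigma**2
    var = 1.0 / precision
    mean = var * np.dot(x, y) / sigma**2
    return mean, var

rng = np.random.default_rng(0)
theta_true = 2.0
x = rng.normal(size=200)
y = theta_true * x + 0.5 * rng.normal(size=200)

m, v = posterior(x, y)
# With 200 observations the posterior concentrates near theta_true.
```

Here the "data" $D$ in Bayes' theorem is the pair $(x, y)$, the likelihood is $p(y|\theta,x)$, and training amounts to computing (or approximating) $p(\theta|y,x)$; prediction for a new $x^*$ would then average $f_{\theta}(x^*)$ over this posterior.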