
According to the source I'm referring to, this is the function for the uniform prior:

$$P(\theta) = 1, \qquad 0 \le \theta \le 1$$

This was followed by:

$$P(\theta \mid n, N) = \frac{\binom{N}{n}\theta^n(1-\theta)^{N-n}\,(1)}{\int_0^1 \binom{N}{n}\theta^n(1-\theta)^{N-n}\,d\theta}$$

where $P(\theta)$ is replaced by $1$. Why was that done?

Any help is greatly appreciated.

  • Your initial $P(\theta)$ looks like a prior distribution for $\theta$ uniformly distributed on $[0,1]$, though perhaps easier to read as a $\text{Beta}(1,1)$ distribution. Your later $P(\theta \mid n,N)$ is then presumably supposed to be the posterior distribution for $\theta$ after $N$ observations of which $n$ are $1$, still with support on $[0,1]$ and perhaps easier to read as a $\text{Beta}(n+1,N-n+1)$ distribution, though your denominator requires some deciphering. – Henry Apr 12 '22 at 19:29
  • This is all connected to your earlier questions https://stats.stackexchange.com/questions/571305/what-does-it-mean-for-a-binomial-distribution-to-be-normalized-over-something and https://stats.stackexchange.com/questions/571330/uniform-prior-is-this-description-of-the-function-correct - do you understand how to go from a prior distribution to a posterior distribution? – Henry Apr 12 '22 at 19:31
  • I don't think I do... because I don't know how the (1) came into the equation for the posterior distribution... – Richie Harvy Apr 12 '22 at 19:48
  • The (1) is included for the prior part in Bayes' formula$$p_1(\theta|n)=\dfrac{p_2(n|\theta)p_3(\theta)}{p_4(n)}$$with the notation using $p_i$ for all densities. – Xi'an Apr 13 '22 at 05:46

1 Answer


You are confused because the notation the authors used is sloppy. What they meant is that the prior is constant (equal to $1$); since the likelihood is a binomial distribution, $\theta$ is a probability and is therefore bounded to the $[0, 1]$ range. The $(1)$ in the equation is exactly the $P(\theta)$ from above: a uniform prior on $[0,1]$ has density $1$ everywhere, so multiplying by it changes nothing, and the posterior is just the normalized likelihood.
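A quick numerical check of this (my own Python sketch, not from the question's source): with a uniform prior the factor $P(\theta)=1$ drops out, and the posterior is the binomial likelihood divided by its integral over $[0,1]$, which equals the $\text{Beta}(n+1, N-n+1)$ density mentioned in the comments.

```python
import math

def posterior(theta, n, N):
    """Posterior density of theta given n successes in N trials, uniform prior."""
    # Binomial likelihood times the (constant) uniform prior P(theta) = 1.
    likelihood = math.comb(N, n) * theta**n * (1 - theta)**(N - n)
    prior = 1.0  # uniform prior on [0, 1]; it contributes only the factor (1)
    # Normalizing constant: integral of likelihood * prior over [0, 1].
    # With a uniform prior this is comb(N, n) * B(n+1, N-n+1) = 1 / (N + 1).
    evidence = 1.0 / (N + 1)
    return likelihood * prior / evidence

def beta_pdf(theta, a, b):
    """Closed-form Beta(a, b) density for comparison."""
    const = math.gamma(a + b) / (math.gamma(a) * math.gamma(b))
    return const * theta**(a - 1) * (1 - theta)**(b - 1)

# The posterior matches the Beta(n+1, N-n+1) density at every point.
n, N = 3, 10
for theta in (0.1, 0.3, 0.5, 0.9):
    assert abs(posterior(theta, n, N) - beta_pdf(theta, n + 1, N - n + 1)) < 1e-9
```

Because `prior` is literally `1.0`, you could delete it from the product without changing anything; that is all the authors did when they wrote $(1)$ in place of $P(\theta)$.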

Tim