I have used Bayesian reasoning in my research and have found it extremely useful. The book I have read is E. T. Jaynes's *Probability Theory: The Logic of Science*. The idea is to formulate propositions; probability theory then tells you how to assign numbers (probabilities) to those propositions, conditional on your information and starting from some prior probabilities.
A proposition is something that is definitely either true or false, irrespective of any observer. Therefore, for a given coin, "the probability of Heads is $p$" is not a proposition, because in the Bayesian view probability depends on the observer (on his/her information) and is not an objective property of the coin (like its mass or temperature).
Suppose I toss the coin once and get Heads. I ask "What's the probability of Heads in the next toss?" Consider the propositions: $$H_k\equiv\textrm{Heads in $k$-th toss}\\ T_k\equiv\textrm{Tails in $k$-th toss}$$
Then my question is: $P(H_2|H_1)=?$
I assume uniform prior probabilities: $P(H_k)=P(T_k)=1/2$ for every $k$. The result of the first toss must change the probability of Heads on the second toss (I am not assuming the coin is fair; if, say, 100 tosses all turned up Heads, I would suspect the coin to be biased in favour of Heads, and probability theory should indicate the same to me).
Bayes' rule gives: $$P(H_2|H_1)=\frac{P(H_1|H_2)P(H_2)}{P(H_1)}=P(H_1|H_2),$$ since $P(H_1)=P(H_2)=1/2$ under my prior.
This gets me nowhere. How do I get a number and thus update the probability of Heads with each toss?
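To make the dead end concrete, here is a minimal sketch (my own illustration, not from Jaynes) in which the two tosses are modelled as fully independent fair draws. Enumerating the joint distribution and conditioning on $H_1$ returns exactly the prior $1/2$, i.e. no learning occurs:

```python
from itertools import product

# Model each toss as an independent draw with P(H_k) = P(T_k) = 1/2.
# Then every joint outcome of two tosses has probability 1/2 * 1/2 = 1/4.
outcomes = list(product("HT", repeat=2))
p_joint = {o: 0.25 for o in outcomes}

# Condition on Heads in the first toss: P(H_2 | H_1) = P(H_1, H_2) / P(H_1).
p_h1 = sum(p for o, p in p_joint.items() if o[0] == "H")
p_h1_and_h2 = p_joint[("H", "H")]

print(p_h1_and_h2 / p_h1)  # 0.5 -- conditioning on H_1 teaches me nothing
```

So as long as the tosses are assigned independent probabilities, no amount of data can move the probability of Heads.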
In this post and in some articles I have read online, this issue is resolved by taking "the probability of Heads is $p$" as a proposition and then seeking its probability (which amounts to seeking the probability of a probability). This does give an answer. My only problem (which I believe is a major one) is that, as argued above, that statement is not a proposition, so asking for its probability is nonsense. What is the way out of this conundrum?
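For concreteness, the recipe from those articles (the one I am questioning) treats the bias $p$ as an unknown parameter with a uniform prior, giving $P(H_2|H_1)=\int_0^1 p^2\,dp \big/ \int_0^1 p\,dp = 2/3$, which is Laplace's rule of succession. A sketch, assuming a uniform $\mathrm{Beta}(1,1)$ prior on $p$ (the function name is mine):

```python
from fractions import Fraction

def posterior_predictive(heads: int, tosses: int) -> Fraction:
    """P(next toss is Heads | observed data), under a uniform Beta(1,1)
    prior on the bias p.  The Beta-Binomial integral reduces to Laplace's
    rule of succession: (heads + 1) / (tosses + 2)."""
    return Fraction(heads + 1, tosses + 2)

print(posterior_predictive(1, 1))      # 2/3 after one Heads
print(posterior_predictive(100, 100))  # 101/102 -- strong suspicion of bias
```

This behaves exactly as I would want (100 Heads in a row drives the predictive probability toward 1), which is why the conceptual objection above bothers me so much.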