
Suppose I have N biased coins. The bias of each coin $j$ is known: $p_j$.

What is the probability that I throw at least K heads using all N coins and tossing them each once?

The edge case of at least one coin showing head ($K=1$) is clear to me: $P(K) = 1-\prod_{j=1}^N(1-p_j)$.

But I don't see how to generalize the binomial distribution so that it handles a different success probability for each coin. Thanks for your help!

broidul
  • I am having trouble making sense of your equation $P(K) = 1-\prod_{j=1}^N(1-p_j)$. Can you clarify the notation and the thought process? – BruceET Mar 20 '22 at 21:42
  • This is called the Poisson binomial distribution; see the tag [tag:poisson-binomial-distribution]. – kjetil b halvorsen Mar 20 '22 at 23:02
  • @BruceET, I want to know the probability of at least one coin toss resulting in heads in the simple example. The complement of this probability ($1-P(K)$) is no toss resulting in heads, i.e., all tosses resulting in tails ($\prod_{j=1}^N(1-p_j)$, with $1-p_j$ being the probability of a tail for each coin). – broidul Mar 21 '22 at 08:31
  • @kjetilbhalvorsen, thanks for your hint! This is what I came looking for! How is it possible to mark your comment as the answer? If that is not possible, can you put your comment in a separate answer? – broidul Mar 21 '22 at 08:54

1 Answer


From your Question, I guess you may be confused by this problem. I will try to get you started with a few simple results.

The total number of heads is the sum of $N$ independent Bernoulli random variables $X_j.$ Let $N = 3$ and $p_1 = 0.3, p_2 = 0.4, p_3 = 0.6.$ It seems clear that $E(X_1) = 0.3, E(X_2) = 0.4, E(X_3) = 0.6,$ so that $$E(S = X_1 + X_2 + X_3) = 1.3.$$ Somewhat similarly, $$V(S) = (0.3)(0.7) + (0.4)(0.6) + (0.6)(0.4) = 0.690.$$ However, especially for large $N,$ I can see no elementary way to find the complete distribution of $S$: $P(S=0), P(S=1), P(S=2), P(S = 3).$

A few special results are easy: $P(S = 0) = (.7)(.6)(.4) = 0.168$ (which you may have had in mind for the equation in your question). Also, $P(S = 3) = (.3)(.4)(.6) = 0.072.$ [Perhaps probability generating functions would help for $P(S =1)$ and $P(S = 2)$.]
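For what it's worth, the full distribution of $S$ (the Poisson binomial distribution mentioned in the comments) can be built up exactly by convolving in one Bernoulli coin at a time. Here is a short sketch in Python; the function name is mine, not from any standard library:

```python
# Exact Poisson binomial PMF by dynamic programming:
# start with the "zero coins" distribution and fold in
# each Bernoulli(p) coin via a two-branch convolution.
def poisson_binomial_pmf(ps):
    pmf = [1.0]  # P(S = 0) = 1 before any coins are tossed
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # this coin lands tails
            new[k + 1] += q * p        # this coin lands heads
        pmf = new
    return pmf

pmf = poisson_binomial_pmf([0.3, 0.4, 0.6])
# pmf[0] = 0.7*0.6*0.4 = 0.168 and pmf[3] = 0.3*0.4*0.6 = 0.072,
# matching the two special cases above.
```

This runs in $O(N^2)$ time, so it stays practical even for fairly large $N$.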

Simulation might give useful approximations. Below, I simulate a million realizations of $S$ for $N = 3$ and the probabilities used just above.

set.seed(2022)
x1 = rbinom(10^6, 1, .3)
x2 = rbinom(10^6, 1, .4)
x3 = rbinom(10^6, 1, .6)
s = x1 + x2 + x3
mean(s); var(s)
[1] 1.299137   # aprx 1.30
[1] 0.6910407  # aprx 0.69
table(s)/10^6
s
       0        1        2        3
0.168689 0.435489 0.323818 0.072004

The first tabled result is $P(S = 0) \approx 0.168$ and the last is $P(S = 3) \approx 0.072.$ Presumably, values for $P(S = 1)$ and $P(S = 2)$ are also reasonable approximations.
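Coming back to the original question, $P(\text{at least } K \text{ heads})$ is just the upper tail of this distribution. For small $N$ it can even be computed by brute force, summing the probabilities of all $2^N$ outcomes with at least $K$ heads; a sketch in Python (the function name is illustrative):

```python
from itertools import product

# P(at least K heads) for independently tossed biased coins,
# by enumerating all 2^N head/tail outcomes -- fine for small N.
def p_at_least(ps, K):
    total = 0.0
    for outcome in product([0, 1], repeat=len(ps)):   # 1 = heads
        if sum(outcome) >= K:
            prob = 1.0
            for h, p in zip(outcome, ps):
                prob *= p if h else (1 - p)
            total += prob
    return total

p_at_least([0.3, 0.4, 0.6], 1)  # ≈ 0.832 = 1 - (0.7)(0.6)(0.4)
```

For $K = 1$ this reproduces your formula $1-\prod_{j=1}^N(1-p_j)$; for general $K$ it agrees with summing the tail of the exact PMF.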

Note: In a binomial model, we consider the sum of results on Bernoulli trials.

  • There must be a fixed number $N$ of trials (otherwise you may have a geometric or negative binomial).
  • The trials must be independent (if the trials are dependent because of sampling without replacement, then the distribution may be hypergeometric).
  • The probability of success must be the same on each trial (otherwise you have the situation in this question, perhaps the most difficult change to the binomial model to handle).
BruceET
  • I have problems already at the beginning of your answer: you state that $E(S = X_1 + X_2 + X_3) = 1.3$. But in my opinion, this only holds true if the events are mutually exclusive. If this is not the case, as in my example above, you need to subtract the overlap: $E(S) = E(X_1)+E(X_2)+E(X_3)-E(X_1\cap X_2)-E(X_1\cap X_3)-E(X_2\cap X_3)+E(X_1\cap X_2\cap X_3)$ – broidul Mar 21 '22 at 09:29