Attempting to understand a statistical concept that I'm positive is basic stats, but that I currently don't understand. Say there's a one in ten million likelihood of an outcome happening during an event, the event happens a given number of times, and each occurrence is roughly independent of the others. Do the overall odds of the outcome happening change based on the number of events, or not?
For example, say the odds are one in ten million that an outcome will happen, and the event occurs 25 times. Is the outcome more likely, and if so, what is the explanation, and how do the odds change from one in ten million?
Ex: a (1 in 10 million) chance the outcome happens translates to a (9999999/10000000) chance it does NOT happen. Thus, (9999999/10000000)^25 is the chance it does NOT happen (at all) in a set of 25 tries. Then subtract this from 1 to get the chance it DOES happen at least once in a set of 25 tries.
– Joe Murray Oct 22 '14 at 18:50
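The complement calculation in the comment above can be sketched in a few lines of Python (the variable names are illustrative, not from the original):

```python
# Probability that a rare outcome occurs at least once in n independent trials,
# using the complement rule described in the comment above.
p_single = 1 / 10_000_000   # 1-in-ten-million chance per trial
n = 25                      # number of independent trials

p_never = (1 - p_single) ** n      # chance it never happens in n tries
p_at_least_once = 1 - p_never      # chance it happens at least once

print(p_at_least_once)
```

The result is just under 25/10,000,000 (about 2.5e-6), since for a probability this small over 25 trials, multiplying the per-trial probability by the number of trials is a very close upper bound.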