In Shapiro et al., when discussing the loss of molecules as a source of error in single-cell sequencing, it is written that:
> Another source of error is losses, which can be severe. The detection limit of published protocols is $5$–$10$ molecules of mRNA. If, as seems likely, the limit of detection is primarily determined by losses during sample preparation, this would indicate that $80$–$90\%$ of mRNA was lost. Or, to put it the other way around, a $90\%$ loss leads to an approximately $50\%$ chance of failing to detect a gene that is expressed at a level of seven mRNA molecules (from the binomial distribution).
How is this probability computed using the binomial distribution? I thought that $90\%$ loss corresponds to $5$ detected molecules, and I assume that $k=7$ for the binomial calculation, but I am unable to go further.
The probability of losing all seven molecules is $0.9^7$; one can check (e.g. in Python, `0.9 ** 7`) that this is $0.4782969$. That's probably what the modelling in terms of the binomial distribution means, but to me it is much easier to understand in terms of this very basic probability calculation. – bli Dec 20 '17 at 13:25
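For what it's worth, here is a minimal sketch of that binomial calculation, under the assumption that each of the $n = 7$ molecules is detected independently with probability $p = 0.1$ (i.e. $90\%$ loss); `scipy` is used only to evaluate the PMF:

```python
from scipy.stats import binom

# Assumption: each of the 7 mRNA molecules is detected independently
# with probability p = 0.1 (90% loss), so the number of detected
# molecules follows Binomial(n=7, p=0.1).
n, p = 7, 0.1

# Failing to detect the gene = detecting zero molecules.
p_fail = binom.pmf(0, n, p)
print(p_fail)    # ~0.4783
print(0.9 ** n)  # same value: all 7 molecules lost independently
```

Detecting zero molecules corresponds to $\binom{7}{0} \, 0.1^0 \, 0.9^7 = 0.9^7 \approx 0.478$, which matches the "approximately $50\%$" figure in the quoted passage.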