Suppose there is a device that measures binary data (0 or 1) with 90 percent accuracy. I have used this device to measure the same data twice (or, in another scenario, three times). If both measurements show 0, what is the overall accuracy of the two (or three) measurements combined, i.e. what is the probability that the data is actually 0?
Here is a possible solution, but I am not sure whether it is correct (or whether this approach has a name).
If measuring twice: $$A = 0.9 + 0.9\,(1 - 0.9) = 0.99$$ If measuring three times: $$B = A + 0.9\,(1 - A) = 0.999$$
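One way I tried to sanity-check a formula like this is a small Monte Carlo simulation. Note the extra assumption it needs, which the problem statement does not give: a prior on the data. I assume the underlying bit is equally likely to be 0 or 1; the function name `simulate` and all parameter names are mine.

```python
import random

def simulate(n_measurements, trials=1_000_000, accuracy=0.9, seed=0):
    """Estimate P(true bit = 0 | all n_measurements readings are 0).

    Assumes a uniform prior: the true bit is 0 or 1 with equal
    probability. Each reading independently matches the true bit
    with probability `accuracy`.
    """
    rng = random.Random(seed)
    all_zero = 0    # trials where every reading came out 0
    truly_zero = 0  # of those, trials where the bit really was 0
    for _ in range(trials):
        bit = rng.randint(0, 1)
        readings = [bit if rng.random() < accuracy else 1 - bit
                    for _ in range(n_measurements)]
        if all(r == 0 for r in readings):
            all_zero += 1
            if bit == 0:
                truly_zero += 1
    return truly_zero / all_zero

# Compare the simulation against the Bayesian posterior under the
# same uniform-prior assumption:
#   P(0 | k zero readings) = 0.9^k / (0.9^k + 0.1^k)
for k in (2, 3):
    exact = 0.9**k / (0.9**k + 0.1**k)
    print(k, simulate(k), exact)
```

Under that uniform-prior assumption the simulation settles near $0.81/0.82 \approx 0.9878$ for two readings and $0.729/0.730 \approx 0.9986$ for three, so whether my formula above is the right model is exactly what I am unsure about.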