I am really trying to understand convergence in probability, but there is an example I am struggling with, and perhaps it points toward a deeper issue. Take the following simple case of convergence in probability to a constant, using the exponential distribution:
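For concreteness, the example on the linked page (as I recall it) is: let $X_n \sim \text{Exponential}(n)$ for $n = 1, 2, 3, \ldots$, and show that $X_n \xrightarrow{p} 0$. The proof there is just that, for any $\epsilon > 0$,

$$P\big(|X_n - 0| \geq \epsilon\big) = P(X_n \geq \epsilon) = e^{-n\epsilon} \longrightarrow 0 \quad \text{as } n \to \infty.$$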
I don't understand how this works. Suppose we are running an experiment measuring how much time passes until a new customer enters a shop. If that is the setup, then what I don't get is how this sequence of random variables is defined on the same probability space. After all, a random variable is a function from outcomes to the real numbers, but here, to put it simply (though perhaps inaccurately), the function itself does not change. That is, it is not as if different real values get assigned to the various outcomes as $n$ grows.
Instead, what changes is the rate at which events take place; in other words, it looks like the underlying probability measure changes, not the random variables themselves. If I have a probability measure in hand for the various events, surely the random variable modelling this experiment must adopt the rate determined by that measure? If that is the case, how can a sequence like this exist and be defined on a single probability space?
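To spell out the picture I have in mind (my own framing, not taken from the reference): take the sample space to be $\Omega = (0, \infty)$, the possible waiting times, let $X$ be the identity map, and push the rate into the measure, so that each $n$ comes with its own measure $P_n$:

$$X(\omega) = \omega, \qquad P_n(A) = \int_A n e^{-n\omega} \, d\omega \quad \text{for measurable } A \subseteq (0, \infty).$$

In this framing the function $X$ is fixed and only the measure $P_n$ varies with $n$, which is exactly why I cannot see how all the $X_n$ are supposed to live on one common space $(\Omega, \mathcal{F}, P)$.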
Reference: https://www.probabilitycourse.com/chapter7/7_2_5_convergence_in_probability.php
