
A stochastic process is usually defined as a family of random variables $X: \Omega \times T \rightarrow \mathbb{R}$. A realization of this process can be written as the series $x_i = X(w, t_i)$ for $t_i \in T$ and $w \in \Omega$. This is where my first concern arises. In this notation (borrowed from a textbook), $X$ seems to be the same RV throughout. But at the same time my understanding was that at every time point $t_i$ a stochastic process could be the realization of a different RV (e.g. $X_{t_i}$). In that case, what happens if these distinct RVs don't share the same sample space $\Omega$? How does one properly (formally) write down a realization of a stochastic process?
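To make the two viewpoints concrete, here is a minimal sketch (my own example, not from any textbook: a Gaussian random walk) in which all the $X_t$ live on one common sample space, so the question of mismatched $\Omega$'s doesn't arise. Fixing $w$ (a row) picks out one realization, a function of $t$; fixing $t$ (a column) picks out one random variable $X_t$:

```python
import numpy as np

rng = np.random.default_rng(0)

# A simple stochastic process on a COMMON sample space: a Gaussian random walk.
# Conceptually, omega is the whole sequence of increments, so every X_t is
# defined on that same Omega -- no mismatch of sample spaces arises.
T = 50          # time points t = 0, ..., T-1
n_omega = 1000  # number of draws of omega (i.e. of whole sample paths)

increments = rng.normal(size=(n_omega, T))   # one row per omega
paths = np.cumsum(increments, axis=1)        # X(omega, t) for every omega, t

# Fixing omega (one row) gives ONE realization: a function of t.
one_realization = paths[0]       # shape (T,)

# Fixing t (one column) gives ONE random variable X_t, sampled across omegas.
X_at_t10 = paths[:, 10]          # shape (n_omega,)
```

The same array thus encodes both readings: $X(w): T \rightarrow \mathbb{R}$ (rows) and $X_t: \Omega \rightarrow \mathbb{R}$ (columns).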

In another textbook, I read the following quote, and I don't really understand what is meant. In general, it made me think about why one would want the joint distributions of a stochastic process to be time-invariant (stationarity), and how that allows us to draw conclusions about multiple realizations. I'm having trouble connecting the dots.

In most cases, we observe only one realization $x_t(w)$ of the stochastic process (a single $w$). Hence it is clear that we need additional assumptions, if we want to draw conclusions about the joint distributions (which involves many $w$’s) from a single realization. The most common such assumption is stationarity.
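The quote can be illustrated numerically. A minimal sketch, assuming a stationary AR(1) process of my own choosing (and assuming, beyond stationarity, that the process is ergodic, which AR(1) is): the time average along a single realization (one $w$) agrees with the ensemble average at a fixed time across many $w$'s. That agreement is exactly what lets one realization stand in for many:

```python
import numpy as np

rng = np.random.default_rng(1)

def ar1(n, phi=0.6, sigma=1.0, rng=rng):
    """One realization of a stationary AR(1): X_t = phi*X_{t-1} + eps_t."""
    x = np.empty(n)
    # start from the stationary distribution N(0, sigma^2 / (1 - phi^2))
    x[0] = rng.normal(scale=sigma / np.sqrt(1 - phi**2))
    eps = rng.normal(scale=sigma, size=n)
    for t in range(1, n):
        x[t] = phi * x[t - 1] + eps[t]
    return x

# Time average over ONE long realization (a single omega)...
time_avg = ar1(100_000).mean()

# ...versus the ensemble average at a FIXED time across many omegas.
ensemble = np.array([ar1(200)[-1] for _ in range(2_000)])
ensemble_avg = ensemble.mean()

# For this stationary, ergodic process both estimates are close to the
# true mean 0: the single-omega time average recovers a many-omega quantity.
print(time_avg, ensemble_avg)
```

Note the hedge in the quote is real: stationarity alone is not quite enough for this (one also needs an ergodicity-type condition), but it is the most common first assumption.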

  • Does our thread on the definition of stochastic processes resolve your concerns? If not, what additional explanations are you looking for? – whuber Jan 20 '23 at 16:23
  • @whuber This clarifies the first (big) chunk about the formalism of stochastic processes. I was seeing stochastic processes in reverse, i.e. as a function of time producing an RV, $X(t): \textrm{RV}(w) \rightarrow \mathbb{R}$, instead of as an RV producing a function of time, $X(w): T \rightarrow \mathbb{R}$. Thank you for that! – Dime Jan 20 '23 at 17:04
  • @whuber It doesn't help me with the need for stationarity, though. Once $w$ is fixed, why should we care about a time-invariant function? How does stationarity allow us to draw conclusions about many $w$'s using a single-$w$ realization (as stated in the quote)? I still struggle to connect the dots here. – Dime Jan 20 '23 at 17:04
  • I am struggling to understand the question because it reads like several questions at once. Could you please edit it to focus on what you need to know? I do notice that according to this limited definition of "realization," each realization is a single observation of an $n$-variate random variable. You can't deduce much from that! But when that realization is of a very special nature (such as involving equally spaced times) and (when the times are not discrete) you make strong assumptions in addition to stationarity, you can indeed estimate some properties of the process. – whuber Jan 20 '23 at 17:12
  • @whuber The definition of SS is that the process satisfies $(X_{t_1}, \ldots, X_{t_k}) \sim (X_{t_1 + h}, \ldots, X_{t_k + h})$ for all $t_1, \ldots, t_k, h \in \mathbb{Z}$. From $X(w): T \rightarrow \mathbb{R}$, the "time dimension" appears after sampling $X$ at $w$. I feel, then, that stationarity brings nice properties only to the function $T \rightarrow \mathbb{R}$ and not to the process $X(w)$ itself. I don't see how stationarity gives nice properties of $X$ itself (across different $w$'s). – Dime Jan 20 '23 at 17:41
  • Stationarity concerns the distributions, not the specific (full) realizations of the process. It does not assert equality of random variables! – whuber Jan 20 '23 at 19:11
  • @whuber I am not 100% sure I get this. I know stationarity concerns distributions. But still, the SS definition requires the joint distributions of the process $X_t(w) = X(w)(t)$ to be time-invariant. This is precisely where I get confused, because the time component of the stochastic process $X(w)(t)$ comes after the realization producing the function of time $T \rightarrow \mathbb{R}$. I then have trouble understanding how the distributions (which depend only on the $w$'s) of $X(w)(t)$ can be related to time-invariance (where the time $t$ of the process $X(w)(t)$ is consumed after the realization $w$). – Dime Jan 20 '23 at 21:03
  • I truly don't understand the sense in which a "time component" can "come after" a realization. – whuber Jan 20 '23 at 21:36
  • @whuber Let's put it differently. We defined a stochastic process as $X(w): T \rightarrow \mathbb{R}$. This is somewhat clear. Now, how exactly is the process distribution defined? This may be the source of confusion. For an RV $X: \Omega \rightarrow \mathbb{R}$, the CDF is defined as $F_X: x \mapsto \mathbb{P}(\{ w \in \Omega \mid X(w) \leq x \})$ for $x \in \mathbb{R}$. How does the time component of a random process fit precisely into that definition? – Dime Jan 21 '23 at 09:05
  • Fix a finite sequence $(t_1, t_2, \ldots, t_n)$ of times $t_i \in T$. The variables $X_i: \omega \mapsto X(\omega)(t_i)$ have an $n$-variate distribution. Those are the distributions about which Kolmogorov's Theorem is concerned. – whuber Jan 21 '23 at 16:18
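The last comment's finite-dimensional-distribution view can be simulated directly. A sketch, again assuming a stationary AR(1) as a stand-in process: fix times $(t_1, t_2, t_3)$; the map $\omega \mapsto (X(\omega)(t_1), X(\omega)(t_2), X(\omega)(t_3))$ is a 3-variate RV, and stationarity says its distribution is unchanged when all times are shifted by $h$:

```python
import numpy as np

rng = np.random.default_rng(2)
phi, n_omega, T = 0.6, 50_000, 40

def paths(n, T, phi=phi, rng=rng):
    """n independent stationary AR(1) realizations of length T."""
    x = np.empty((n, T))
    x[:, 0] = rng.normal(scale=1 / np.sqrt(1 - phi**2), size=n)
    eps = rng.normal(size=(n, T))
    for t in range(1, T):
        x[:, t] = phi * x[:, t - 1] + eps[:, t]
    return x

X = paths(n_omega, T)        # one row per omega, one column per time
t = np.array([3, 7, 12])     # fixed finite sequence of times
h = 20                       # time shift

# The 3-variate RV omega -> (X_{t1}, X_{t2}, X_{t3}), sampled across omegas,
# versus the shifted triple (X_{t1+h}, X_{t2+h}, X_{t3+h}).
C = np.cov(X[:, t], rowvar=False)
C_shifted = np.cov(X[:, t + h], rowvar=False)
# Stationarity says the two 3-variate distributions agree, so the sample
# covariance matrices should match up to sampling error.
```

Here the distributions are taken over the $w$'s (rows), while time-invariance is a statement about which columns are selected; that is how the "time component" and the probability measure on $\Omega$ interact.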

0 Answers