I have a series of values produced by some process, and I want to compute the variance of the mean of this series. The series is made up of contiguous sub-series; within each sub-series the values are correlated. All sub-series follow the same pattern, but they have different lengths.
I read on Wikipedia that for correlated variables the following holds:
$$\operatorname{Var}\left(\overline{X}\right) = \frac{\sigma^2}{n} + \frac{n - 1}{n}\rho\sigma^2 $$
where $\rho$ is the average correlation.
Q1: What does "average correlation" mean, and how is it computed?
Intuitively I would expect it to be something like the autocorrelation at a given lag, or a sum over all lags of the autocorrelation function of the series, but I am not sure.
Q2: Is this approach reasonable?
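To make Q1 concrete, here is a minimal sketch (Python with NumPy, using a simulated AR(1) series as a stand-in for my data) of the interpretation I currently have in mind: take $\rho$ to be the average of all pairwise correlations $\operatorname{Corr}(X_i, X_j)$ for $i \ne j$, which for a stationary series reduces to a lag-weighted average of the sample autocorrelation function. The simulated data and the estimator are my own assumptions, not anything taken from the Wikipedia article.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-in for my series: an AR(1) process (my real data is different).
n, phi = 500, 0.6
x = np.empty(n)
x[0] = rng.normal()
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

xc = x - x.mean()
var = xc @ xc / n                      # plain (biased) variance estimate

# Sample autocorrelation at lags k = 1, ..., n - 1.
acf = np.array([xc[:-k] @ xc[k:] / (n * var) for k in range(1, n)])

# My reading of "average correlation": the mean of Corr(X_i, X_j) over all
# pairs i != j; under stationarity a pair (i, j) only matters through the
# lag k = |i - j|, and there are (n - k) pairs at lag k.
weights = np.arange(n - 1, 0, -1)
rho_bar = (weights * acf).sum() / (n * (n - 1) / 2)

# Plug into the formula Var(X_bar) = sigma^2/n + (n-1)/n * rho * sigma^2.
var_mean = var / n + (n - 1) / n * rho_bar * var
print(rho_bar, var_mean)
```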
EDIT: I checked these questions:
Variance of a sum of identically distributed random variables that are not independent
Variance of sum of dependent random variables
and this article.
According to the comments, I should use
$$\operatorname{Var}\left(\sum_{i=1}^n X_i\right) = \sum_{i=1}^n \operatorname{Var}\left(X_i\right) + 2\sum_{1\le i<j\le n}\operatorname{Cov}\left(X_i, X_j\right)$$
but I am not sure how to compute $\operatorname{Cov}\left(X_i, X_j\right)$.
If it helps, the data arises from a stationary process (although it always starts with a complete sub-series).
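Based on that, my current attempt is the sketch below: since the process is stationary, $\operatorname{Cov}(X_i, X_j)$ should depend only on the lag $|i - j|$, so I replace each covariance by the sample autocovariance at that lag. The function name `var_of_mean` and the optional `max_lag` truncation are my own choices, not something from the linked questions.

```python
import numpy as np

def var_of_mean(x, max_lag=None):
    """Estimate Var(X_bar) from sample autocovariances, assuming stationarity
    so that Cov(X_i, X_j) depends only on the lag |i - j|."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xc = x - x.mean()
    if max_lag is None:
        max_lag = n - 1
    # gamma[k] estimates Cov(X_i, X_{i+k}); gamma[0] is the variance.
    gamma = np.array([xc[: n - k] @ xc[k:] / n for k in range(max_lag + 1)])
    # Var(sum X_i) = n * gamma(0) + 2 * sum_k (n - k) * gamma(k),
    # then divide by n^2 to get the variance of the mean.
    k = np.arange(1, max_lag + 1)
    return (n * gamma[0] + 2 * ((n - k) * gamma[1:]).sum()) / n**2
```

I added `max_lag` because summing noisy autocovariance estimates over all $n - 1$ lags seems unstable, but I do not know where to cut it off, which is part of why I am asking whether this approach is reasonable.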