When $Z = (Z_1, \ldots, Z_p)$ has a standard multivariate Normal distribution (all means are zero and the covariance matrix is the identity), then for $p\times p$ matrices $Q$ and $\Delta,$
$$X = Q \Delta Z$$
has covariance matrix
$$\operatorname{Cov}(X) = E[X X^\prime] = Q\Delta E[Z Z^\prime] \Delta^\prime Q^\prime = Q(\Delta\Delta^\prime)Q^\prime.$$
Thus, if you wish this matrix to equal a specified covariance matrix $\Sigma,$ it suffices to diagonalize $\Sigma.$ The Spectral Theorem says there exists an orthogonal matrix $Q$ and diagonal matrix $\Lambda$ for which
$$\Sigma = Q \Lambda Q^\prime$$
and, because $\Sigma$ is positive semi-definite, all the diagonal entries of $\Lambda$ are non-negative and therefore have square roots. Let $\Delta$ be the diagonal matrix of the square roots of the entries of $\Lambda.$
You generate values of $Z$ by generating independent standard normal values of the $Z_i.$
Here, then, is a quick and dirty R function to generate $n$ iid realizations of $X.$ It uses eigen to compute $Q$ and $\Lambda$ and sqrt to take the square roots. Each realization is a column in the output matrix.
rmnorm <- function(n, Sigma) {
  with(eigen(Sigma, symmetric = TRUE), {  # `vectors` is Q; `values` holds the diagonal of Lambda
    p <- length(values)
    # Form Q Delta and multiply by the transpose of an n x p matrix of iid standard Normals:
    tcrossprod(vectors %*% diag(sqrt(pmax(values, 0)), p), matrix(rnorm(n * p), n))
  })
}
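As a quick sanity check (my addition, not part of the recipe), the sample covariance of a large number of realizations should reproduce $\Sigma$ up to sampling error:

Sigma <- matrix(c(2, 1, 1, 3), 2)  # An arbitrary positive-definite example
x <- rmnorm(1e4, Sigma)            # 2 x 10000; each column is one realization
cov(t(x))                          # Should be close to Sigma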
The pmax(...) stuff deals with situations where floating point error creates tiny negative eigenvalues. Taking the square root of a negative number would produce NaN, so we replace those eigenvalues with true zeros first.
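If you want to see the problem this guards against, here is a made-up illustration: a covariance matrix that is exactly singular in theory can acquire a slightly negative eigenvalue in floating point.

v <- c(1, 2, 3) / 7                           # Any vector will do
eigen(tcrossprod(v), symmetric = TRUE)$values # Rank 1: two eigenvalues are zero in
                                              # exact arithmetic, but may come out
                                              # as tiny negative numbers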
It will be convenient to encapsulate the Matérn covariance in a function. Here is its correct evaluation. The only numerical twist is that we have to specify its value at $0$ explicitly, for otherwise we might find ourselves multiplying $0$ by an infinity.
C <- function(x, nu = 1, rho = 1, sigma = 1, tol = 1e-8) {
  d <- abs(x) * sqrt(2 * nu) / rho  # Scaled distance
  # At d = 0 the covariance is sigma^2; elsewhere apply the Matern formula,
  # 2^(1-nu) / Gamma(nu) * d^nu * K_nu(d):
  ifelse(d <= tol, 1, (d^nu * besselK(d, nu)) * 2^(1-nu) / gamma(nu)) * sigma^2
}
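One reassuring check (mine, not part of the exercise): with the $\sqrt{2\nu}$ scaling used here, $\nu = 1/2$ reduces the Matérn covariance to the exponential covariance $\sigma^2 e^{-|x|/\rho},$ so the two should agree to floating point precision:

lags <- seq(0, 3, length.out = 7)
max(abs(C(lags, nu = 1/2) - exp(-lags)))  # Effectively zero (~ 1e-16)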
The rest is a matter of looping. As far as responding to the exercise goes, I think it's more revealing (a) to use $\nu\in\{1/2, 1, 4\}$ because the qualitative nature of a series with $\nu=1$ will be visibly different from series with $\nu\gt 1,$ all of which are rather similar; (b) to reduce the realizations from $100$ to around $30$ so you can see the individual series; and (c) to use $150$ points instead of $50$ to show the detail.

Theory explains that the "jaggedness" of each series is determined by how differentiable the covariance function is at a lag of zero, so I have plotted the covariance functions for reference.
Below is the looping code. It uses one optimization: the entries in the covariance matrix are the values of $C$ at all the possible distances between the time points $x.$ With $150$ evenly spaced points there are only $150$ distinct lags (including the lag of zero), even though the matrix contains $150^2 = 22,500$ entries. We therefore save a great deal of computing by first calculating those $150$ values (in the array s, which does double duty by providing the heights for the plot of $C$) and then scattering them appropriately through the covariance matrix S. This makes the code execute almost instantaneously. (An equivalent one-line construction of S appears after the code.)
x <- seq(0, 1, length.out = 150) # The evenly-spaced time points
mu <- seq(1, 4, length.out = 30) # Mean height of each series in its plot
cols <- hsv(runif(length(mu), 0.02, 0.65)) # A range of random colors, one per series
set.seed(17) # To reproduce the figure exactly
par(mfcol = c(2, 3))  # Place the plots in a 2 x 3 grid
for (nu in c(1/2, 1, 4)) {  # The smoothness parameters to use
  s <- C(x - x[1], nu = nu)  # Covariance at each of the 150 distinct lags
  plot(x - x[1], s, type = "l", ylim = 0:1, lwd = 2, xlab = "Lag", ylab = "",
       main = bquote(paste("Matérn Covariance for ", nu == .(nu))))
  S <- outer(seq_along(x), seq_along(x), \(i, j) s[abs(i - j) + 1])  # Scatter the lags
  # Center each series, then shift it to its own baseline mu[j]:
  y <- scale(rmnorm(length(mu), S), scale = FALSE) + rep(mu, each = length(x))
  plot(range(x), range(mu) + c(-1, 1), type = "n", xlab = "x", ylab = "Value",
       main = bquote(paste("Time series for ", nu == .(nu))))
  sapply(seq_along(mu), \(j) lines(x, y[, j], col = cols[j]))
}
par(mfrow = c(1, 1))  # Restore the default plot layout
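As an aside on the scattering step: because the points are evenly spaced, S is a symmetric Toeplitz matrix, so the outer(...) call could equally well be replaced by base R's toeplitz. A minimal equivalent, assuming x and nu are as in the loop above:

# Equivalent construction of S: a symmetric Toeplitz matrix whose first row
# holds the covariances at lags 0, h, 2h, ...
S <- toeplitz(C(x - x[1], nu = nu))

Which to use is a matter of taste; both are effectively instantaneous at this size.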