4

I have the following question from Communication Systems by Dr Sanjay Sharma: "Show that the random process $X(t) = A\cos(\omega t + \theta)$, where $\theta$ is a random variable uniformly distributed in the range $(0, 2 \pi )$, is a wide-sense stationary process." In the solution, while calculating the mean, the author writes $\mu _X(t) = \int_{-\infty}^{\infty}X f_X(x,t)\, dx$ and $f_X(x,t) = f_{\theta}(\theta) = \frac{1}{2\pi} U(0,2\pi)$. But while calculating the mean of a random variable (before introducing random processes), the book used the formula $\mu _X = \int_{-\infty}^{\infty}x f_X(x)\, dx$. I am not able to understand the meaning of the mean/expectation for a random process (which one is the random variable, and which one is the distribution function). According to me it should have been $\mu _X(t) = \int_{-\infty}^{\infty}\theta f_{\theta}(\theta)\, d\theta$.

PS. In Simon Haykin the formula for the mean is $\mu _X(t) = \int_{-\infty}^{\infty}x f_{X(t)}(x)\, dx$, which means the integration has to be performed with respect to the same variable that is being multiplied by $f$.

euler16
  • 109
  • 1
  • 7
  • 1
    Depending on how you try to understand it, the expression "$\mu_X(t) = \int_{-\infty}^{\infty}X f_X(x,t)\, dx$" is either nonsensical or wrong. The right-hand side needs to be $\int_{-\infty}^{\infty}x f_X(x,t)\, dx$. Is this what you are asking about — a typographical error? – whuber Apr 19 '17 at 13:15

1 Answer

0

$\mu_X(t)$ is a conditional expectation, which means it is a function of $t$ rather than a number as is the case for a regular expectation. Here $\theta$ is a random variable and $t$ is some variable (possibly to be made random at some later time) and $\omega$ is a fixed parameter. So $\mu_X(t)$ represents the mean value of $X$ at $t$, having integrated out the random variable $\theta$.

I would personally read this whole apparatus as $X$ being a family of functions of a random variable $\theta$ and some parameters $t, A, \omega$, so we could index a member of the family as $X_{t,A,\omega}$. We can (apparently) obtain the expectation $E_{f(\theta)}[X_{t,A,\omega}(\theta)]$ for all members of the family in closed form.
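As a quick sanity check on this reading, a Monte Carlo sketch: draw many samples of $\theta \sim U(0, 2\pi)$ and average $A\cos(\omega t + \theta)$ at several values of $t$. The sample means should all be close to $0$ regardless of $t$, consistent with a mean function that does not depend on time. (The values of $A$ and $\omega$ below are illustrative, not taken from the book.)

```python
import numpy as np

# Illustrative parameters (assumptions, not from the original problem).
A, omega = 2.0, 1.0

rng = np.random.default_rng(0)
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)  # theta ~ U(0, 2*pi)

# Sample mean of X(t) = A*cos(omega*t + theta) at several fixed times t:
# each is the average over the randomness in theta, with t held fixed.
for t in [0.0, 0.5, 1.0, 3.0]:
    mean_t = np.mean(A * np.cos(omega * t + theta))
    print(t, mean_t)  # each sample mean is close to 0, independent of t
```

Note that $t$ is never sampled here: the expectation integrates out $\theta$ only, which is exactly why $\mu_X(t)$ is a (constant, in this case) function of $t$ rather than a single number over all sources of variation.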

conjectures
  • 4,226
  • what exactly is meant by $X(t) = A\cos(\omega t + \theta)$? Is it a distribution? I read in Haykin that $X(t_1)$ is a random variable. Then shouldn't $X(t_1)$ be equal to $\theta$ (which is a random variable)? – euler16 Apr 19 '17 at 12:29
  • Given that the question concerns the concepts underlying the notation, I am concerned that characterizing $\mu_X(t)$ as a "conditional" expectation might further confuse the issue by (incorrectly) suggesting $t$ is a random variable. Later you refer to $t$ as a parameter. This means $\mu_X$ is indeed a function but it is not "conditional" in the sense of being conditioned on values of a random variable. Furthermore, although it is intended that many values of $t$ be considered in any application, typically $A$ and $\omega$ are fixed: maybe they shouldn't all be lumped as "parameters." – whuber Apr 19 '17 at 13:10
  • @euler16 $X(t)$ is a random variable, because (at least) $\theta$ is random and $X(t)$ is a function of $\theta$. $X(t)$ could not be a distribution, as it need not integrate to one. To take its expectation we need to know its distribution, but we don't. However, we do know the distribution of $\theta$, and one could potentially express the density of $X$ transformed into $\theta$ (except that the relationship isn't straightforwardly invertible, because $\cos(-y)=\cos(y)$)... blah, blah. Really this is just saying look at $\int_0^{2\pi} \frac{1}{2\pi}A\cos(\omega t + \theta)\, d\theta$. – conjectures Apr 19 '17 at 15:04
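The integral in the last comment can be checked numerically with a simple Riemann sum over one period of $\theta$ (the values of $A$ and $\omega$ below are illustrative assumptions):

```python
import numpy as np

# Numerically evaluate mu_X(t) = (1/2π) ∫_0^{2π} A cos(ωt + θ) dθ
# with a left Riemann sum. A and omega are illustrative, not from the post.
A, omega = 1.0, 2.0
N = 10_000
theta = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)  # grid over [0, 2π)
dtheta = 2.0 * np.pi / N

for t in [0.0, 0.7, 2.5]:
    mu = np.sum((1.0 / (2.0 * np.pi)) * A * np.cos(omega * t + theta) * dtheta)
    print(t, mu)  # ≈ 0 for every t: the mean is constant in t
```

Because $\cos$ integrates to zero over a full period regardless of the phase $\omega t$, the result is (numerically) zero for every $t$, which is the constant-mean part of the wide-sense stationarity claim.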