
Let $W[k]$ be a stationary white-noise process with variance $1$.

Question: Is $X[k] = W[k] + c \cdot W[k-1]$ white noise?

$c$ is a real number.

Alon
  • Spoiler: Lecture notes say it isn't but I can't see why... – Alon Mar 11 '18 at 18:23
  • Hint: $W[k]+ cW[k-1]$ denotes a FIR filter. – Marcus Müller Mar 11 '18 at 19:05
  • Another option for proving this would be to write down the definition of the autocorrelation of a signal $Y[k]$, then insert $X[k]=W[k]+cW[k-1]$ into that definition. You know the ACF of white noise! – Marcus Müller Mar 11 '18 at 19:06
  • @MarcusMüller do you mind elaborating a little more on how I can conclude that this is not white noise from the fact that $X[k]$ denotes a FIR filter? – Alon Mar 12 '18 at 10:12
  • You ask yourself, "what's the result of applying that filter to a white signal, is it still white?", and notice that no: for $c\ne0$ that filter is not an allpass, so the output is not white. – Marcus Müller Mar 12 '18 at 10:23
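Marcus Müller's allpass argument can be checked numerically. A minimal sketch (my addition, not from the thread; $c = 0.5$ is an arbitrary example value): the FIR filter $H(z) = 1 + c z^{-1}$ has a magnitude response that is not flat for $c \ne 0$, so it cannot map white noise to white noise.

```python
import numpy as np

# Sketch (my addition): magnitude response of H(z) = 1 + c z^{-1}.
# For c != 0 the response is not flat, i.e. the filter is not an allpass.
c = 0.5                                 # arbitrary nonzero example value
w = np.linspace(0, np.pi, 512)          # frequencies in [0, pi]
H = 1 + c * np.exp(-1j * w)             # frequency response of 1 + c z^{-1}
mag = np.abs(H)

# Ratio of largest to smallest magnitude: |1+c| / |1-c| = 3 for c = 0.5,
# so the response varies by a factor of 3 across the band.
print(mag.max() / mag.min())
```

For $c = 0$ the same computation gives a constant magnitude of $1$, consistent with the trivial case where $X[k] = W[k]$ stays white.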

1 Answer


Calculate the autocorrelation of the process.

$$\begin{align} R_{xx}[n] &=\mathbb{E}[(W[k] + c W[k-1])(W[k-n] + c W[k-1-n])] \\ &=\mathbb{E}[W[k]W[k-n]]+ \mathbb{E}[cW[k]W[k-1-n]]+\mathbb{E}[cW[k-1]W[k-n]]+\mathbb{E}[c^2W[k-1]W[k-1-n]] \\ &=\sigma^2\delta[n]+c\sigma^2\delta[n+1]+c\sigma^2\delta[n-1]+c^2\sigma^2\delta[n]\\ &=\sigma^2(1+c^2)\delta[n]+c\sigma^2\delta[n+1]+c\sigma^2\delta[n-1] \end{align}$$

The definition of white noise implies that $R_{xx}[n]=\sigma^2\delta[n]$, which is not the case here.
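As a numerical sanity check (my addition, not part of the original answer; it assumes unit-variance Gaussian white noise as in the question, with an arbitrary $c = 0.5$), the sample autocorrelation matches the derived $R_{xx}[n]$:

```python
import numpy as np

# Sketch (my addition): estimate R_xx[n] for X[k] = W[k] + c*W[k-1]
# with unit-variance Gaussian white noise W, as in the question.
rng = np.random.default_rng(0)
c, N = 0.5, 1_000_000
W = rng.standard_normal(N)
X = W[1:] + c * W[:-1]                  # X[k] = W[k] + c*W[k-1]

def r(x, n):
    """Sample autocorrelation E[X[k] X[k-n]] at lag n >= 0."""
    return np.mean(x * x) if n == 0 else np.mean(x[n:] * x[:len(x) - n])

# Expected from the derivation: (1+c^2)*sigma^2 = 1.25, c*sigma^2 = 0.5, 0.
print(r(X, 0), r(X, 1), r(X, 2))
```

The nonzero value at lag $1$ is exactly the term $c\sigma^2\delta[n-1]$ in the derivation, and it is what disqualifies $X[k]$ as white noise.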

Tendero
    Something is awry in this answer because the LHS is a function of $k$ while the RHS does not depend on $k$ at all after the second step. Furthermore, even if we take $n$ to be a typo for $k$, the alleged autocorrelation function is not an even function of its argument. – Dilip Sarwate Mar 12 '18 at 01:37
  • No time to derive right now but the correlation of $X$ is zero at all lags greater than one. The only non-zero correlation is at lag one and it's equal to $c \sigma^2$. $X$ is essentially an MA(1). – mark leeds Mar 12 '18 at 03:59
  • Note that the notation probably confused tendero. A better notation would be $X(t) = W(t) + cW(t-1)$. – mark leeds Mar 12 '18 at 04:01
  • Also, the variance of $X(t)$ is $(1 + c^2)\sigma^2$, which is tendero's first term. – mark leeds Mar 12 '18 at 04:58
  • One more correction: $c\sigma^2$ is the autocovariance at lag 1, not the autocorrelation. – mark leeds Mar 12 '18 at 05:00
  • @markleeds uh, what's the $\mu$ for white noise? you'll find that autocovariance = autocorrelation for zero-mean processes. – Marcus Müller Mar 12 '18 at 10:28
  • I, too, think that $n$ was a typo for $k$, but still, shouldn't the function be an even function of its argument? – Alon Mar 12 '18 at 10:38
  • @DilipSarwate You're right, sorry for that. I meant to write $R_{xx}[n]$ at the beginning, not $k$. Also, I messed up in the third term of the autocorrelation, the delta was at $n-1$, not $n+1$. – Tendero Mar 12 '18 at 13:24
  • @JoschKraus I've corrected it, sorry for the delay. – Tendero Mar 12 '18 at 13:24
  • Thanks, I think I understood now. Gotta think about the fact that the "-1" delay in the W term causes a +1 delta impulse and the "-1" in the second W term causes a -1 delta impulse but I think I can figure it out – Alon Mar 12 '18 at 15:33
  • @JoschKraus As a hint: think of $k-1-n$ as $k-(1+n)$. – Tendero Mar 12 '18 at 15:39
  • @Downvoter I would like to know why you downvoted the answer... – Tendero Mar 12 '18 at 15:39
  • Hi Marcus: The autocovariance = autocorrelation if the mean is zero, but the variance also has to be 1.0. The variance being 1.0 is not a requirement for white noise, at least not in statistics-econometrics. – mark leeds Mar 12 '18 at 15:44
  • @markleeds Note that the only requirement for autocorrelation and autocovariance to be equal is that the mean is $0$. By definition, $$C_{xx}(k)=\mathbb{E}[(X(n)-\mu_x)(X(n-k)-\mu_x)]$$ $$R_{xx}(k)=\mathbb{E}[X(n)X(n-k)]$$

    If $\mu_x=0$, both are the same.

    – Tendero Mar 12 '18 at 15:51
  • hi tendero: In the statistics world, the cov of $x,y$ divided by the square root of (var of $x$ times var of $y$) = the correlation of $x$ and $y$. So, in the autocov case, the autocov of $x$ divided by the var of $x$ = the autocorr of $x$. So, they are only equal when the var of $x$ is 1. But maybe the definition is different in DSP? – mark leeds Mar 12 '18 at 16:52
  • I have to leave but the last sentence in the top part of this link implies that they are different because corr is normalized. I can find the official statistical formula later. The link is: https://en.wikipedia.org/wiki/Covariance – mark leeds Mar 12 '18 at 16:57
  • Hi All: The formula for the correlation of $x$ and $y$ is at the link that follows. Note that they use $x,y$, which therefore need to be replaced with $x_1$ and $x_2$ for the case of autocorrelation. In any case, the correlation is standardized by the variances as I described earlier. The covariance is the numerator in the formula for correlation. http://www.statisticshowto.com/probability-and-statistics/correlation-coefficient-formula/ – mark leeds Mar 13 '18 at 04:32
  • Just one other thing since this discussion was much longer than I expected. The beginning of the document at the link that follows contains the derivations of what I stated earlier regarding the covariance and variance of the MA(1). https://mcs.utm.utoronto.ca/~nosedal/sta457/ma1-and-ma2.pdf. – mark leeds Mar 13 '18 at 04:42
  • @Tendero, According to https://en.wikipedia.org/wiki/Covariance_and_correlation, the difference between autocovariance and autocorrelation is a normalization using the variance, as Mark Leeds said, not that the means are zero. Since the variance in this case is one, they are the same. Perhaps you were thinking about RMS vs Standard Deviation. This has no impact on your proof which is pretty slick. I gave you an upvote to counter the downvote you mentioned. – Cedron Dawg Mar 13 '18 at 22:21
  • @Tendero, Oops, I put the "as Mark Leeds said" in the wrong spot. It should read "... the difference between autocovariance and autocorrelation is a normalization using the variance, as Mark Leeds said, not that the means are zero, as you said." Sorry. – Cedron Dawg Mar 13 '18 at 22:36
  • @CedronDawg Well, that was how I was taught the topic at college. Maybe I am recalling wrongly, though. Thanks for the observation and the upvote, btw – Tendero Mar 14 '18 at 01:15
  • @Cedron: Thanks for the additional comment. But how do you know the variance is 1.0? I don't think that's a requirement for white noise, at least not in statistics-econometrics. – mark leeds Mar 14 '18 at 04:59
  • @mark leeds, You're welcome. it is stated in the first sentence of the question: "Let W[k] be a stationary white noise with variance = 1" – Cedron Dawg Mar 14 '18 at 10:18
  • Good conversation, all! @CedronDawg: I edited your previous comment to be in line with the correction in your "Oops" comment. :-) – Peter K. Mar 14 '18 at 11:20
  • @CedronDawg Check out this answer by Matt. I think that Matt is never wrong (he's some kind of DSP deity), so now I can say with more confidence that this may be a matter of convention, or a difference between definitions used in stats and in signal processing. – Tendero Mar 14 '18 at 15:57
  • @Tendero, Matt L. is impressive indeed. There is also a logical fallacy known as "appeal to authority". A variation is "If it is on the internet it must be true.", which my reference to Wikipedia falls under. So being a bit outside my comfort zone, I'm going to say that it seems that the distinction between [auto]correlation and [auto]covariance is normalization by the variance values. Correlation is dimensionless, while covariance is not. Matt seems to have disregarded this in his answer that you referenced. – Cedron Dawg Mar 14 '18 at 16:45
  • @Tendero, (continued) DSP frequently (pun intended) deals with stationary signals where the means are zero, but assuming they are zero and altering the definition of correlation/covariance based on this assumption are two different matters. It certainly explains your recollection though. As more of a mathematician than an engineer, I find DSP folks are often a bit sloppy. Sometimes this is pragmatic, other times it introduces misunderstandings. – Cedron Dawg Mar 14 '18 at 16:45
  • @CedronDawg Yes, I agree that the definition I've always used might be wrong from a mathematical point of view. It appears to be a pretty widespread misconception among engineers, though. I'll try to get used to the normalization term in the future. (Loved the pun, btw!) – Tendero Mar 14 '18 at 16:48
  • @Tendero, Don't forget the missing means either. Side note: The greatest, most widespread, misconception I have come across is that leakage in a DFT is exactly described by the sinc function. That, and what the term "exact" means, and whether it is important. – Cedron Dawg Mar 14 '18 at 17:05
  • Hi Tendero, Cedron: When you write the covariance as an expectation, that E() has units, and you don't want a "similarity" measure to be dependent on the units being used. So, correlation is just a scaling used to standardize the measure. Example: the t-stat (another unitless measure) in a simple linear regression can be written as a function of the correlation², not the covariance². So, definitely not interchangeable in stats-land. Thanks to both of you for all the wisdom you provide to this list. – mark leeds Mar 14 '18 at 17:30
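A quick simulation (my addition, with an arbitrary $c = 0.5$) illustrating the statistics convention discussed above: the normalized lag-1 autocorrelation of this MA(1) is the lag-1 autocovariance $c\sigma^2$ divided by the variance $(1+c^2)\sigma^2$, i.e. $c/(1+c^2)$.

```python
import numpy as np

# Sketch (my addition; c = 0.5 is an arbitrary example value).
# Statistics convention: autocorrelation = autocovariance / variance.
c = 0.5
theory = c / (1 + c**2)   # c*sigma^2 / ((1+c^2)*sigma^2) = 0.4 for c = 0.5

rng = np.random.default_rng(1)
W = rng.standard_normal(500_000)        # unit-variance white noise
X = W[1:] + c * W[:-1]                  # MA(1): X[k] = W[k] + c*W[k-1]

# np.corrcoef normalizes by the variances, matching the stats definition.
est = np.corrcoef(X[1:], X[:-1])[0, 1]  # sample lag-1 autocorrelation
print(theory, est)                      # est should be close to theory
```

Under the DSP (unnormalized, zero-mean) convention the same lag-1 quantity comes out as $c\sigma^2$ instead, which is the discrepancy the thread is circling around.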