
I've come across some papers where certain forecast errors are standardized to have unit variance. Unfortunately that's the only information they provide, and I have no idea how to obtain/calculate their results.

Suppose I have a vector of 3-year forecasts that I would like to standardize to have unit variance. I first calculate the forecast errors as actual value − forecasted value, which gives me a vector of forecast errors.

I've come up with the following, where $x$ is my vector of forecast errors for the 3-year forecasts:

$x_{\text{stand}} = (x - \text{mean}(x)) / \text{std}(x)$

Is this correct, or would this only be valid for one-year forecasts?
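For concreteness, here is a minimal sketch of what I'm doing in Python/NumPy (the forecast and actual values are made-up illustrative numbers, not real data):

```python
import numpy as np

# Hypothetical actual values and 3-year-ahead forecasts (illustrative only)
actual = np.array([2.1, 1.8, 3.0, 2.5, 1.9, 2.7])
forecast = np.array([1.9, 2.2, 2.6, 2.4, 2.3, 2.1])

# Forecast errors: actual value minus forecasted value
errors = actual - forecast

# Standardize: subtract the mean and divide by the sample standard deviation.
# Dividing by std alone already yields unit variance; subtracting the mean
# additionally centers the errors at zero.
z = (errors - errors.mean()) / errors.std(ddof=1)

print(z.var(ddof=1))  # should be (numerically) 1.0
```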

    You don't need to subtract mean(x) unless you also want it to have a mean of 0. Just dividing by the standard deviation will give it unit variance already. – David Robinson Aug 17 '14 at 15:24
  • Also: a variable standardized in the way you calculate is often represented with the symbol $z$. ($x-\bar{x}/s_{x}$ is sometimes called the "$z$ transformation", assuming $\bar{x} \approx \mu$, and $s_{x}\approx \sigma_{x}$.) – Alexis Aug 17 '14 at 16:05
  • Just in case anyone happens to be confused, Alexis meant to type $(x-\bar x)/s_x$ there. – Glen_b Aug 17 '14 at 16:38
  • @Glen_b is a sharp reader as usual. :) I definitely left out the parentheses in my haste. – Alexis Aug 17 '14 at 18:15
