
Kendall's tau is a measure of association between two random variables, say $X$ and $Y$. If $X$ and $Y$ are independent, then $\tau = 0$. However, $\tau = 0$ does not imply independence, because Kendall's tau only captures monotonic association. Are there conditions under which the two are equivalent?

Edit to account for @Cardinal's comment

Say that $X > 0$ and $Y > 0$.

chl
user7064
  • There are quite a few, actually. Some come from statistics and others come from closely allied fields, like information theory. Specifying your model in a little more detail can yield more precise suggestions. – cardinal Mar 27 '12 at 15:04
  • This question needs disambiguation. Does it ask whether (a) there are conditions on $X$ and $Y$ that assure $\tau=0$ implies independence of $X$ and $Y$, or (b) there are alternative statistics other than $\tau$ whose vanishing implies independence of $X$ and $Y$, no matter how $(X,Y)$ are distributed? – whuber Mar 27 '12 at 15:18
  • Normally, $\tau$ is a sample statistic and is not considered a parameter of a joint distribution. Therefore, when $X$ and $Y$ are independent with joint distribution $F$ and $(x_i,y_i)$ is an iid sample from $(X,Y)$, $\tau$ is a function of this sample and we can say $\tau$ is $0$ only in expectation: $\mathbb{E}_F[\tau]=0$. In particular, $\tau=0$ does not even imply lack of correlation of $X$ and $Y$: it merely is statistical evidence of lack of correlation. If these considerations lead you to want to modify your question, then please go ahead and edit it. – whuber Mar 29 '12 at 17:51
  • @whuber: I think you might be thinking of the sample version of Kendall's $\tau$. As a population quantity, this is defined as $\tau = \mathbb P((X-X')(Y-Y') > 0) - \mathbb P((X-X')(Y-Y')<0)$ where $(X,Y)$ and $(X',Y')$ are iid bivariate vectors with continuous distributions. From this definition, it is clear that if $X$ and $Y$ are independent, then $\tau = 0$. The OP seems to be asking for a characterization of the reverse implication. – cardinal Mar 29 '12 at 23:55
  • @cardinal: Yes, indeed :-) – user7064 Mar 30 '12 at 04:24
  • Thanks, Cardinal: that is the clarification I was attempting to elicit. – whuber Mar 30 '12 at 13:59

1 Answer


I am going to interpret your question as one regarding a hypothesis on the population quantity $\tau$. If this is not what you intended, please comment to that effect and I will revise the answer accordingly.

Definition and equivalent expressions

Let $(X,Y)$ be a bivariate random vector with a continuous joint distribution function and let $(X',Y')$ be an independent copy. Kendall's tau is defined as $$ \renewcommand{\Pr}{\mathbb P} \tau = \Pr( (X-X')(Y-Y') > 0 ) - \Pr( (X-X')(Y-Y') < 0) \>. $$
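This population quantity is easy to approximate by simulation. Below is a minimal Monte Carlo sketch (assuming NumPy; the variable names are mine) that estimates $\tau$ for a bivariate normal pair with correlation $\rho$ by averaging $\sgn(X-X')\sgn(Y-Y')$, and compares the estimate against the known closed form $\tau = \frac{2}{\pi}\arcsin \rho$ for that family.

```python
import numpy as np

rng = np.random.default_rng(42)
rho, n = 0.6, 500_000

# n iid copies of (X, Y) and of the independent copy (X', Y'),
# both bivariate normal with correlation rho.
cov = [[1.0, rho], [rho, 1.0]]
xy = rng.multivariate_normal([0.0, 0.0], cov, size=n)
xy2 = rng.multivariate_normal([0.0, 0.0], cov, size=n)

# tau = P((X - X')(Y - Y') > 0) - P((X - X')(Y - Y') < 0)
#     = E[ sgn(X - X') sgn(Y - Y') ]   (ties occur with probability 0)
tau_mc = np.mean(np.sign(xy[:, 0] - xy2[:, 0]) * np.sign(xy[:, 1] - xy2[:, 1]))

# Known closed form for the bivariate normal: tau = (2 / pi) arcsin(rho).
tau_exact = 2.0 / np.pi * np.arcsin(rho)
```

With this many draws the two values agree to about two decimal places.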

Here are some equivalent definitions under the stated hypotheses that help elucidate this quantity.

  1. $\newcommand{\sgn}{\mathrm{sgn}}\tau = \Pr( \sgn(X-X') = \sgn(Y-Y') ) - \Pr( \sgn(X-X') \neq \sgn(Y-Y') )$.
  2. $\tau = \mathbb E \sgn(X-X')\sgn(Y-Y') = \mathrm{Cov}(\sgn(X-X'),\sgn(Y-Y'))$.
  3. $\tau = 4 \Pr(X<X',Y<Y') - 1$.
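As a sanity check that these expressions agree, here is a small Monte Carlo sketch (assuming NumPy; the dependent pair $Y = X^3 + \varepsilon$ is just an illustrative choice). Forms (1) and (2) coincide exactly on a continuous sample, and form (3) matches up to simulation error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400_000

# A dependent, non-Gaussian pair and an independent copy of it.
x = rng.normal(size=n)
y = x**3 + rng.normal(size=n)
x2 = rng.normal(size=n)
y2 = x2**3 + rng.normal(size=n)

dx, dy = np.sign(x - x2), np.sign(y - y2)

tau1 = np.mean(dx == dy) - np.mean(dx != dy)   # form (1)
tau2 = np.mean(dx * dy)                        # form (2)
tau3 = 4 * np.mean((x < x2) & (y < y2)) - 1    # form (3)
```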

Note that $X-X'$ and $Y-Y'$ are symmetric about zero. From this and either (1) or (3), it is clear that if $X$ and $Y$ are independent then $\tau = 0$.

The question at hand, as I understand it, is to characterize the reverse implication.

A first characterization

Using (3) above, we have that $\tau = 0$ implies that $$ \Pr(X-X' < 0, Y-Y' < 0) = \Pr(X-X'<0)\Pr(Y-Y'<0) = 1/4 \>. $$ By symmetry, this implies that the probability in each of the four quadrants of the distribution of $(X-X',Y-Y')$ is 1/4.

We can give the following interpretation: $\tau = 0$ if and only if the event that $X-X'$ is positive or negative is independent of the event that $Y-Y'$ is positive or negative.

This is, at least on the surface, much weaker than what (full) independence of the random variables $X-X'$ and $Y-Y'$ would require, which is that, for each $(a,b) \in \mathbb R^2$, $$ \Pr(X-X'<a,Y-Y'<b) = \Pr(X-X'<a)\Pr(Y-Y'<b) \>. $$

A useful reduction by sufficiency

Since $X$ is assumed to have continuous distribution $F$, say, and $Y$ has continuous distribution $G$, say, then we can restate each of the above in terms of $U = F(X)$ and $V = G(Y)$. For example, (3) above becomes $$ \tau = 4 \Pr(X < X', Y < Y' ) - 1 = 4 \Pr( F(X) < F(X'), G(Y) < G(Y') ) - 1 \>, $$ which by substitution gives $$ \tau = 4 \Pr( U < U', V < V' ) - 1 \>. $$ Note that $U$ and $V$ are both (marginally) $\mathcal U(0,1)$ random variables.

Thus, without loss of generality we can consider bivariate distributions on the unit square with uniform marginals. Such a distribution, usually denoted $C(u,v)$ in the bivariate case, is called a copula.

This also shows why Kendall's tau is invariant to monotone transformations.
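This invariance can be seen directly with the sample version of $\tau$ from `scipy.stats` (a sketch on arbitrary simulated data): $\hat\tau$ is unchanged when both coordinates are pushed through strictly increasing maps, or both through strictly decreasing maps, and it flips sign when exactly one map is decreasing.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(size=500)
y = x + rng.normal(size=500)

tau_xy, _ = stats.kendalltau(x, y)
tau_inc, _ = stats.kendalltau(np.exp(x), y**3)  # both maps strictly increasing
tau_dec, _ = stats.kendalltau(-x, -y)           # both maps strictly decreasing
tau_mix, _ = stats.kendalltau(-x, y)            # exactly one map decreasing

# tau_xy, tau_inc, and tau_dec coincide, while tau_mix = -tau_xy:
# monotone maps preserve (or jointly flip) every concordance comparison.
```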

A (partially) explicit answer in terms of copulas

Let $(U,V)$ have distribution function $C(u,v)$ with uniform marginals. The reformulation of (3) can be written in terms of $C$ as $$ \tau = 4 \Pr(U<U',V<V')-1 = 4 \mathbb E C(U,V) - 1 \>, $$ which provides us with a fairly explicit answer:

Lemma: For $(X,Y)$ with continuous marginals, $\tau = 0$ if and only if $\mathbb E C(U,V) = 1/4$ where $(U,V) = (F(X),G(Y))$ and $C$ is the distribution function of $(U,V)$.

Relation to independence and a counterexample

If $X$ and $Y$ are independent, so are $U = F(X)$ and $V = G(Y)$, in which case $C(u,v) = uv$. Clearly $\mathbb E C(U,V) = \iint_{[0,1]^2} u v \,\mathrm du \,\mathrm dv = 1/4$. But do other examples exist?

Counterexample: Consider the family of copulas $$ C(u,v) = uv + \alpha u(u-1)(2u-1)v(v-1)(2v-1) $$ indexed by the parameter $-1 \leq \alpha \leq 2$. For every value of $\alpha \in [-1,2]$, we have $\tau = 0$, yet $C(u,v) \neq uv$ whenever $\alpha \neq 0$, so $U$ and $V$ are dependent. The subcase of independence corresponds to $\alpha = 0$.
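A numerical sketch (NumPy, midpoint-rule integration on a grid; the helper names are mine) confirms this. Writing $g(t) = t(t-1)(2t-1)$, the copula density is $c(u,v) = \partial^2 C/\partial u\,\partial v = 1 + \alpha\, g'(u) g'(v)$, and $\tau = 4\,\mathbb E\, C(U,V) - 1 = 4 \iint C\,c \,\mathrm du\,\mathrm dv - 1$ vanishes for every admissible $\alpha$, while the density equals the independence density $1$ only at $\alpha = 0$.

```python
import numpy as np

def g(t):
    # g(t) = t(t - 1)(2t - 1), the factor appearing in the copula family
    return t * (t - 1) * (2 * t - 1)

def g_prime(t):
    # g'(t) = 6t^2 - 6t + 1
    return 6 * t**2 - 6 * t + 1

n = 1000
u = (np.arange(n) + 0.5) / n        # midpoint grid on (0, 1)
U, V = np.meshgrid(u, u, indexing="ij")

taus = {}
for alpha in (-1.0, 0.0, 1.0, 2.0):
    C = U * V + alpha * g(U) * g(V)               # the copula C(u, v)
    dens = 1.0 + alpha * g_prime(U) * g_prime(V)  # its density c(u, v)
    # tau = 4 E[C(U, V)] - 1; the grid mean approximates the double
    # integral of C * c over the unit square.
    taus[alpha] = 4.0 * np.mean(C * dens) - 1.0
```

Every entry of `taus` is zero to within the grid error, and `dens` stays nonnegative over the whole range $\alpha \in [-1, 2]$, so each member of the family is a valid copula.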

cardinal
  • Do you mean to say that Kendall's $\tau$ is invariant to all monotone transformations, or only strictly increasing transformations? From copula theory, we know that if $f(\cdot)$ and $g(\cdot)$ are strictly increasing functions, then $C_{f(X)g(Y)}(u,v) = C_{XY}(u,v)$. However, if $f(\cdot)$ and $g(\cdot)$ are strictly decreasing functions, then $C_{f(X)g(Y)}(u,v) = u+v-1 + C_{XY}(1-u,1-v)$. This is from Theorems 2.6 and 2.7 of the reference: https://people.math.ethz.ch/~embrecht/ftp/copchapter.pdf – Kiran K. Dec 15 '16 at 19:16
  • I am wondering if $\tau$ being invariant to decreasing transformations holds, considering Theorems 2.6 and 2.7 in the reference above? – Kiran K. Dec 15 '16 at 19:17
  • @Kiran: Thank you for a careful reading. I think if you look at the definition of $\tau$ (or restatements 1 or 2 above), you will see that the invariance holds in the cases of decreasing $f$ and $g$. – cardinal Dec 15 '16 at 19:51
  • thank you. I think what you are saying is: $P[sgn(X-X')] = P[sgn(f(X)-f(X'))]$, regardless of whether $f(\cdot)$ is increasing or decreasing, and same with $g(\cdot)$ applied to $Y$? – Kiran K. Dec 15 '16 at 20:20
  • Not quite. I'm saying that if the sign of $X - X'$ matches that of $Y -Y'$, then the signs of $f(X)-f(X')$ and $g(Y)-g(Y')$ will match (even though, in the monotone decreasing case, the signs themselves will get flipped). – cardinal Dec 15 '16 at 20:29
  • Just for reference, Theorem 5.1.8 in Nelsen's "An Introduction to Copulas" - 2nd Edition, also confirms this, although indirectly. It states that for any measure of concordance $\kappa$, if $\alpha$ and $\beta$ are almost surely strictly monotone functions on $Ran X$ and $Ran Y$, then $\kappa_{\alpha(X),\beta(Y)} = \kappa_{X,Y}$. Since Kendall's $\tau$ is a measure of concordance, it follows that Kendall's $\tau$ is invariant to strictly monotone transformations (of any type :) – Kiran K. Dec 16 '16 at 18:05
  • @cardinal Sorry to ask this, but are you sure that definition (3) is correct? I mean, is it equivalent to the initial definition of $\tau$? No offence; it is just that I am not sure of how one can get to (3) from the initial definition. The only way I see is considering that $\mathrm{Pr}(X<X' \land Y<Y') = \mathrm{Pr}(X > X' \land Y > Y')$, and I am not sure this holds —OR DO I???? OH, WAIT.... – Vicent Jan 18 '24 at 08:31