The Variation of Information, $VI(X;Y)$, can be defined in terms of the two conditional entropies through the identity $VI(X;Y) = H(X|Y) + H(Y|X)$.
I am curious what the relative weighting of the two conditional entropies that make up the Variation of Information tells us.
For example, if we let $w = \frac{H(Y|X)}{H(X|Y)}$, what would $w < 1$ or $w > 1$ mean with respect to the interplay of information between $X$ and $Y$? Would such a 'measure' serve any useful purpose?
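For concreteness, here is a minimal numerical sketch of the quantities in question. The joint distribution `p` and the helper `conditional_entropy` are made up purely for illustration (not part of the definition above); entropies are in nats.

```python
import numpy as np

def conditional_entropy(p_joint, axis):
    """Entropy of the variable indexed along `axis`, given the other one.

    axis=0 -> H(X|Y)  (rows index x, columns index y)
    axis=1 -> H(Y|X)
    """
    # Marginal of the conditioning variable: sum out the `axis` variable.
    p_cond_var = p_joint.sum(axis=axis)

    def entropy(p):
        p = p[p > 0]  # drop zero cells; 0 * log 0 := 0
        return -np.sum(p * np.log(p))

    # Chain rule: H(A|B) = H(A,B) - H(B).
    return entropy(p_joint.ravel()) - entropy(p_cond_var)

# Hypothetical 2x3 joint distribution p(x, y); rows index x, columns y.
p = np.array([[0.30, 0.10, 0.05],
              [0.05, 0.20, 0.30]])

h_x_given_y = conditional_entropy(p, axis=0)  # H(X|Y)
h_y_given_x = conditional_entropy(p, axis=1)  # H(Y|X)

vi = h_x_given_y + h_y_given_x  # VI(X;Y) = H(X|Y) + H(Y|X)
w = h_y_given_x / h_x_given_y   # the ratio asked about above

print(f"H(X|Y) = {h_x_given_y:.4f} nats")
print(f"H(Y|X) = {h_y_given_x:.4f} nats")
print(f"VI(X;Y) = {vi:.4f} nats, w = {w:.4f}")
```

For this particular joint, $w > 1$: knowing $X$ leaves more residual uncertainty about $Y$ than knowing $Y$ leaves about $X$, which is unsurprising here since $Y$ ranges over three outcomes while $X$ ranges over only two.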