9

Let $X$ be a discrete random variable taking its values in $\mathbb{N}$. I would like to halve this variable, that is, to find a random variable $Y$ such that:

$$X = Y + Y^*$$

where $Y^*$ is an independent copy of $Y$.

  • I am referring to this process as halving; this is made-up terminology. Is there a proper term for this operation in the literature?
  • It seems to me that such a $Y$ always exists only if we accept negative probabilities. Am I correct in this observation?
  • Is there a notion of a best positive fit for $Y$, i.e., the random variable that comes "closest" to solving the equation above?
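To illustrate the second point numerically, here is a small Python sketch (my own construction, not a standard routine) that computes the formal square root of the probability generating function of $X$: if the resulting coefficients are all nonnegative, they form the pmf of a candidate $Y$; a negative coefficient means no genuine $Y$ exists.

```python
import math

def half_pmf(p, n_terms):
    """Formal square root of the PGF of X: coefficients q satisfying
    (sum_k q_k s^k)^2 = sum_k p_k s^k.  Requires p[0] > 0.
    A negative entry in the result means X cannot be "halved"
    with genuine (nonnegative) probabilities."""
    q = [math.sqrt(p[0])]
    for k in range(1, n_terms):
        pk = p[k] if k < len(p) else 0.0
        # coefficient of s^k: 2*q[0]*q[k] + sum_{j=1}^{k-1} q[j]*q[k-j] = p[k]
        cross = sum(q[j] * q[k - j] for j in range(1, k))
        q.append((pk - cross) / (2 * q[0]))
    return q

# Binomial(2, 1/2) halves into Bernoulli(1/2):
print(half_pmf([0.25, 0.5, 0.25], 4))   # -> [0.5, 0.5, 0.0, 0.0]

# Bernoulli(1/2): a negative coefficient appears at index 2,
# so no true halving exists.
print(half_pmf([0.5, 0.5], 4))
```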

Thanks!

  • 1
    In the cases where you can't "halve" exactly, there are multiple possible definitions of "closest"; it depends on what you want to optimize. – Glen_b Oct 19 '15 at 00:11

3 Answers

10

A notion strongly related to this property (though weaker) is decomposability. A decomposable law is a probability distribution that can be represented as the distribution of a sum of two (or more) non-trivial independent random variables. (An indecomposable law cannot be written that way. The "or more" is redundant: any decomposition into more than two variables can be regrouped into two.) A necessary and sufficient condition for decomposability is that the characteristic function $$\psi(t)=\mathbb{E}[\exp\{itX\}]$$ is the product of two (or more) characteristic functions.

I do not know whether the property you consider already has a name in probability theory; it may be linked with infinite divisibility, which is a much stronger property of $X$ but implies it: all infinitely divisible random variables satisfy this decomposition.

A necessary and sufficient condition for this "primary divisibility" is that the square root $$\psi(t)^{1/2}=\mathbb{E}[\exp\{itX\}]^{1/2}$$ of the characteristic function is again a characteristic function.

In the case of distributions with finite integer support, this is rarely the case, since the characteristic function is a polynomial in $\exp\{it\}$ and must factor accordingly. For instance, a Bernoulli random variable, whose characteristic function has degree one in $\exp\{it\}$, is not decomposable.

As pointed out in the Wikipedia page on decomposability, there also exist absolutely continuous distributions that are non-decomposable, like the one with density $$f(x)=\frac{x^2}{\sqrt{2\pi}}\exp\{-x^2/2\}.$$

In the event the characteristic function of $X$ is real-valued, Pólya's theorem can be used:

Pólya’s theorem. If φ is a real-valued, even, continuous function which satisfies the conditions

φ(0) = 1,
φ is convex on (0,∞),
φ(∞) = 0,

then φ is the characteristic function of an absolutely continuous symmetric distribution.

Indeed, in this case $\varphi^{1/2}$ is again real-valued. Therefore, a sufficient condition for $X$ to be "primary divisible" is that $\varphi^{1/2}$ itself satisfies Pólya's conditions, i.e., that $\varphi$ is "root-convex". But the theorem only applies to symmetric distributions, so it is of much more limited use than Bochner's theorem, for instance.

Xi'an
  • 105,342
6

There are some special cases where this holds true, but for an arbitrary discrete random variable, your "halving" is not possible.

  • The sum of two independent Binomial$(n,p)$ random variables is a Binomial$(2n,p)$ random variable, and so a Binomial$(2n,p)$ can be "halved".
    Exercise: figure out whether a Binomial$(2n+1,p)$ random variable can be "halved".

  • Similarly, a Negative Binomial$(2n,p)$ random variable can be "halved".

  • The sum of two independent Poisson$(\lambda)$ random variables is a Poisson$(2\lambda)$; conversely, a Poisson$(\lambda)$ random variable is the sum of two independent Poisson$(\frac{\lambda}{2})$ random variables. Indeed, as @Xi'an points out in a comment, a Poisson$(\lambda)$ random variable can be "halved" as many times as we like: for each positive integer $n$, it is the sum of $2^n$ independent Poisson$\left(\frac{\lambda}{2^n}\right)$ random variables.
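A quick numerical check of the Poisson point (a sketch; `pois` is just the Poisson pmf written out with standard-library functions):

```python
from math import exp, factorial

lam = 3.0   # arbitrary rate for the check
N = 20      # compare the first N pmf values

def pois(l, k):
    """Poisson(l) pmf at k."""
    return exp(-l) * l ** k / factorial(k)

half = [pois(lam / 2, k) for k in range(N)]
# pmf of the sum of two independent Poisson(lam/2) variables:
conv = [sum(half[j] * half[k - j] for j in range(k + 1)) for k in range(N)]
full = [pois(lam, k) for k in range(N)]

# agrees with Poisson(lam) up to floating-point error
print(max(abs(a - b) for a, b in zip(conv, full)))
```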

Dilip Sarwate
  • 46,658
  • 2
    +1 My recollection is that the discrete uniform is a particular case where it's not possible (I believe there are numerous others, but it's one I have looked at). – Glen_b Oct 18 '15 at 15:33
  • Indeed, a uniform distribution is decomposable but not divisible in the above sense. – Xi'an Oct 18 '15 at 15:36
  • 2
    The Poisson distribution is one example of an infinitely divisible distribution, so can be divided in a sum of an arbitrary number of iid variates. – Xi'an Oct 18 '15 at 15:38
-1

The problem, it seems to me, is that you ask for an "independent copy"; otherwise you could just multiply by $\frac{1}{2}$. Instead of writing "copy" (a copy is always dependent), you should perhaps write "two independent, but identically distributed random variables".

To answer your questions,

  • what comes closest is maybe the term convolution: for a given $X$, you are looking for two i.i.d. random variables whose convolution (the distribution of their sum) is the distribution of $X$.

  • if you accept negative probabilities, these are no longer random variables, since there is no longer an underlying probability space. There are cases where you can find such $Y, Y^*$ ($X$ Poisson$(\lambda)$-distributed, $Y$, $Y^*$ Poisson$(\frac{\lambda}{2})$-distributed) and cases where it is not possible ($X$ Bernoulli, for example).

  • I haven't seen one, and I can't imagine how to formalize such a best fit. Usually, approximations to random variables are measured by a norm on the space of random variables; I can't think of approximations of random variables by, or to, non-random variables.
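One ad hoc formalization (my own sketch, not a standard construction): restrict $Y$ to a parametric family and minimize a distance between the law of $Y + Y^*$ and that of $X$. For $X$ Bernoulli$(1/2)$ and $Y$ Bernoulli$(q)$, so that $Y + Y^* \sim$ Binomial$(2, q)$, a grid search over the total variation distance looks like:

```python
# X ~ Bernoulli(1/2); Y ~ Bernoulli(q), so Y + Y* ~ Binomial(2, q).
# Grid-search q to minimize the total variation distance between the laws.
# (A hypothetical "best positive fit" -- the choice of distance is arbitrary.)
target = [0.5, 0.5, 0.0]          # pmf of Bernoulli(1/2) on {0, 1, 2}

def tv(q):
    sum_pmf = [(1 - q) ** 2, 2 * q * (1 - q), q ** 2]   # pmf of Y + Y*
    return 0.5 * sum(abs(a - b) for a, b in zip(sum_pmf, target))

best_q = min((i / 10000 for i in range(10001)), key=tv)
print(best_q, tv(best_q))
```

For this particular distance the minimizer works out to $q = 1 - 1/\sqrt{2} \approx 0.293$, with a residual distance of about $0.086$; a different choice of distance would give a different "best" $Y$, which echoes the comment above that "closest" depends on what you optimize.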

I hope this helps.

mattd
  • 61
  • 3