
Are there any common ways to define a distance between two probability distribution functions (i.e. how similar two distributions are)?

I am interested in the specific case where the random variable is discrete and can only take a finite number of values.

Tony

1 Answer


The Kullback-Leibler divergence (i.e. relative entropy) does exactly that, as long as the two distributions have the same support. Note that it is not symmetric (in general $D(P \| Q) \neq D(Q \| P)$), so it is a divergence rather than a metric in the strict sense.
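For a finite, discrete random variable this is straightforward to compute directly from the definition $D(P \| Q) = \sum_i p_i \log(p_i / q_i)$. A minimal sketch (the function name and example distributions are my own, not from the post):

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(P || Q) in nats for two discrete
    distributions given as sequences of probabilities over the same
    finite set of outcomes. Terms with p_i = 0 contribute nothing
    (by the convention 0 * log 0 = 0)."""
    return sum(pi * math.log(pi / qi)
               for pi, qi in zip(p, q)
               if pi > 0)

# Two distributions over three outcomes with identical support:
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(kl_divergence(p, q))  # small positive value
print(kl_divergence(p, p))  # 0.0 -- divergence of a distribution from itself
```

Note the asymmetry: `kl_divergence(p, q)` and `kl_divergence(q, p)` generally differ, and the divergence blows up to infinity if some outcome has `p_i > 0` but `q_i = 0`, which is why the same-support condition matters.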

ocramz