3

Is it true that $P(X > x) = P(\log X > \log x)$? I can't find any theorem regarding this. I know it works for normal / lognormal distributions, and since the logarithm is an affine transformation and the CDF is increasing it seems plausible, but I can't find any reason or proof for or against it.

Many thanks!

Alex

Vanity
  • 73
  • This is true because the logarithm function is a monotonically increasing function. – kedarps Sep 21 '18 at 18:53
  • 2
    Although the left hand side makes sense for $x=-1,$ what about the right hand side? cc @Kedarps – whuber Sep 21 '18 at 18:55
  • @Kedarps thanks! do you maybe have any formal proof / theorem verifying that? :) – Vanity Sep 21 '18 at 18:57
  • 1
    Vanity, my comment provides a counterexample. It shows you need to impose some conditions on both $X$ and $x.$ – whuber Sep 21 '18 at 18:58
  • 1
    I guess only $x > 0$ has to hold for it to be satisfied, right? – Vanity Sep 21 '18 at 18:59
  • @whuber, I should have specified that logarithm function is a monotonically increasing function, for positive values of $x$ :) I hope that's what you were getting at with the previous comment. – kedarps Sep 21 '18 at 19:00
  • BTW, the logarithm is not an affine transformation. See https://en.wikipedia.org/wiki/Affine_transformation for instance. – whuber Sep 21 '18 at 19:03
  • You are right, my bad :) – Vanity Sep 21 '18 at 19:07
  • 4
    There is not much to prove, since $[X > x]$ if-and-only-if $[\log X > \log x]$, provided that $X$ is non-negative. In general, if $g$ is a measurable bijection between two spaces $\mathcal X$ and $\mathcal Y$ and $X \in \mathcal X$ with probability $1$ then $P(X \in A) = P(g(X) \in g(A))$ for any measurable $A$. This is because the events are the same, you are just defining the event of interest in two different ways. – guy Sep 21 '18 at 19:10
  • For added clarity, $\log$ is a bijection between $[0, \infty]$ and $[-\infty, \infty]$ so $X$ should be non-negative almost surely and we should have $x \ge 0$. – guy Sep 21 '18 at 19:21
  • @guy - might want to post your comments as an answer, you'll get the (well-deserved) credit and everyone else will know this question has been answered! – jbowman Sep 21 '18 at 22:22

2 Answers

5

The simple/heuristic explanation for why this is true is that $[X > x]$ occurs if-and-only-if $[\log X > \log x]$, provided that both $X$ and $x$ are non-negative. Hence $[X > x]$ and $[\log X > \log x]$ are just two names for the same event, so the probability is the same.
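As a quick sanity check, here is a minimal simulation sketch (the lognormal choice for $X$ and the threshold $x = 1.7$ are arbitrary, just for illustration). Since $\{X > x\}$ and $\{\log X > \log x\}$ are the same event, the two sample proportions agree exactly, not just approximately:

```python
import numpy as np

rng = np.random.default_rng(0)

# X is lognormal, hence X > 0 almost surely; x is an arbitrary positive threshold.
X = rng.lognormal(mean=0.0, sigma=1.0, size=1_000_000)
x = 1.7

p_left = np.mean(X > x)                   # Monte Carlo estimate of P(X > x)
p_right = np.mean(np.log(X) > np.log(x))  # Monte Carlo estimate of P(log X > log x)

print(p_left, p_right)  # identical: each draw lies in both events or in neither
```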

I think the above description gets at the heart of what is going on, but since you asked for a "proof" I will also give something formal. Consider a probability space $(\Omega, \mathcal F, P)$ and let $X: \Omega \to \mathcal X$ be a random element and suppose $g: \mathcal X \to \mathcal Y$ is a bijection between the sets $\mathcal X$ and $\mathcal Y$.

Let $A$ be such that $[X \in A] = \{\omega : X(\omega) \in A\}$ is measurable, and recall that we define $P(X \in A)$ and $P(g(X) \in g(A))$ to be $$P(\{\omega \in \Omega: X(\omega) \in A\}) \quad \text{and} \quad P(\{\omega \in \Omega: g(X(\omega)) \in g(A)\})$$ respectively. However, because $g$ is a bijection, it is true that $x \in A$ if-and-only-if $g(x) \in g(A)$, i.e., $$ \{\omega \in \Omega: X(\omega) \in A\} = \{\omega \in \Omega: g(X(\omega)) \in g(A)\}. $$ Hence, $P(X \in A) = P(g(X) \in g(A))$.

To apply this to your problem, let $\mathcal X = [0, \infty]$ and $\mathcal Y = [-\infty, \infty]$ and $g(x) = \log x$. The set $A$ is $(x, \infty]$ and $g(A) = (\log x, \infty]$.
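In other words, applying the general statement with this $g$ and $A$ gives $$P(X > x) = P\big(X \in (x, \infty]\big) = P\big(\log X \in (\log x, \infty]\big) = P(\log X > \log x).$$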

guy
  • 8,892
1

You can make your transformation more general. It just needs to be measurable.

If you define $\mathbb{P}_Y(Y \in A) = \mathbb{P}_Y(g(X) \in A) = \mathbb{P}_X(X \in g^{-1}(A))$ for a measurable $g$, where the pre-image is defined as $g^{-1}(A) = \{ x : g(x) \in A\}$, then the reason $\mathbb{P}_Y$ is still a valid probability measure is that the pre-image satisfies some properties, namely:

  1. $g^{-1}\left(\bigcup_i A_i\right) = \bigcup_i g^{-1}(A_i)$, and
  2. $g^{-1}(A^c) = \left[g^{-1}(A)\right]^c$.

This will mean $\mathbb{P}_Y$ is a valid probability measure because it still satisfies all the axioms, and so it will work for any set in the sigma-field you throw at it, in particular the one you're looking at.
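For instance, here is a sketch of how property 1 gives countable additivity (using also the fact that pre-images of pairwise disjoint sets are pairwise disjoint): for disjoint $A_1, A_2, \dots$, $$\mathbb{P}_Y\Big(Y \in \bigcup_i A_i\Big) = \mathbb{P}_X\Big(X \in g^{-1}\Big(\bigcup_i A_i\Big)\Big) = \mathbb{P}_X\Big(X \in \bigcup_i g^{-1}(A_i)\Big) = \sum_i \mathbb{P}_X\big(X \in g^{-1}(A_i)\big) = \sum_i \mathbb{P}_Y(Y \in A_i),$$ and $\mathbb{P}_Y(Y \in \mathcal Y) = \mathbb{P}_X(X \in g^{-1}(\mathcal Y)) = \mathbb{P}_X(X \in \mathcal X) = 1$, so normalization and countable additivity carry over directly from $\mathbb{P}_X$.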

Taylor
  • 20,630
  • When writing my answer I realized you actually don't need $g$ to be measurable, or indeed very much measurability at all. As long as $[X \in A]$ is measurable and you are working with a bijection, you get trivially that $[g(X) \in g(A)]$ is measurable, even if $g$ is not measurable. And $[X \in A]$ can be measurable without requiring that $A$ is measurable (with respect to whatever $\sigma$-field you are working with on $\mathcal X$). – guy Sep 22 '18 at 02:38
  • @Taylor How do you define a measurable function without referring to measurable sets? – Juho Kokkala Sep 22 '18 at 06:59
  • @JuhoKokkala ah yep – Taylor Sep 22 '18 at 14:02
  • @guy bijective functions are always measurable, so this is more general. Also, you use the pushforward measure without showing it exists. This is why it exists. – Taylor Sep 22 '18 at 14:07
  • Do you have a source for bijection always being measurable? That seems unlikely, since bijections are fundamentally non-measure-theoretic objects, and are not necessarily connected to the underlying $\sigma$-algebras. My answer also goes through without referencing any $\sigma$-algebra other than $\mathcal F$ by making the assumption that $[X \in A] \in \mathcal F$. I don’t use any measure or algebra other than $P$ and $\mathcal F$. – guy Sep 22 '18 at 14:54
  • My heuristic for bijections not necessarily being measurable: there should be $|2^{\mathbb R}|$ bijections from $\mathbb R$ to $\mathbb R$, but if I recall correctly there are only $|\mathbb R|$-many Borel measurable functions, so most bijections should not be Borel measurable. All bijections being Borel measurable also seems to violate Lusin’s theorem. – guy Sep 22 '18 at 14:59