
Let $x, x^{(0)}, x^{(1)} \in \mathbb{R}^2$, $r_1, r_2 \sim U[-1, 1]$. Point $x$ is fixed, $x = (0.32, 0)$, and $x^{(0)} = (0, r_1)$, $x^{(1)} = (1, r_2)$. What is the probability that the fixed point $x = (0.32, 0)$ will be closer to $x^{(0)}$ than to $x^{(1)}$, by Euclidean distance?

Attempt

$x_0 \sim U[-1, 1]$, $x_1 \sim U[0, 2]$; let $x_0' = (x_0 - 0.32) \sim U[-1.32, 0.68]$ and $x_1' = (x_1 - 0.32) \sim U[-0.32, 1.68]$. Thus, we need to find $P(|x'_0| < |x'_1|)$. Any suggestions? I believe it could be reduced to a simple double integral.
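Before setting up any integral, the target probability $P(|x_0'| < |x_1'|)$ can be estimated directly by Monte Carlo from the original two-dimensional setup. A minimal pure-Python sketch (variable names are mine):

```python
import random

random.seed(0)
n = 200_000
closer_to_x0 = 0
for _ in range(n):
    r1 = random.uniform(-1, 1)
    r2 = random.uniform(-1, 1)
    d0 = (0.32**2 + r1**2) ** 0.5  # distance from x = (0.32, 0) to x0 = (0, r1)
    d1 = (0.68**2 + r2**2) ** 0.5  # distance from x = (0.32, 0) to x1 = (1, r2)
    closer_to_x0 += d0 < d1

p = closer_to_x0 / n
print(p)  # ≈ 0.798
```

Any closed-form answer should land near this estimate (standard error here is about $0.001$).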

  • When you add the error to $x^{(0)}$ and $x^{(1)}$, do you draw one value from U[-1,1] and add this value to both or do you draw two independent errors adding the first to $x^{(0)}$ and the second to $x^{(1)}$? – Tyrel Stokes Oct 05 '20 at 03:22
  • @TyrelStokes Two independent errors, adding the first to $x^{(0)}$ and second to $x^{(1)}$. – Inter Veridium Oct 05 '20 at 07:30
  • Could you please explain what you mean by the "Euclidean distance" between a point $x$ in the plane and a point like $x^{(0)}$ or $x^{(1)}$ in the line? – whuber Oct 05 '20 at 14:55
  • @whuber Sorry. I understood it as if we've got $x^{(0)} = (0, 0)$, and by adding noise we set it as $x^{(0)} = (0, r_1)$, where $r_1 \sim U[-1, 1]$, same for $x^{(1)} = (1, r_2)$, $r_2 \sim U[-1, 1]$. – Inter Veridium Oct 05 '20 at 15:16
  • Because that is an unusual meaning of "added noise," please edit your post to explain it. (Most, if not all, readers would understand "adding noise" to mean that $x^{(0)}$ is a number to which a random, zero-expectation value has been added, producing another number, not a point in the plane.) I still cannot make sense of your question with this new interpretation, though: the point $(0,r)$ always has a distance at least $1$ from all points of the form $(1,y)$ and a distance no more than $1$ from $(0,0),$ so isn't the answer trivially zero? – whuber Oct 05 '20 at 18:07
  • @whuber Indeed, but we've also got a fixed point $x = (0.32, 0)$. So, e.g. assume that $x^{(0)} = (0, 0.5)$, $x^{(1)} = (1, 0.5)$. $x^{(0)}$ is closer to point $x$, as $d(x^{(0)}, x) < d(x^{(1)}, x)$. What is the probability that the fixed point $x = (0.32, 0)$ will be closer to $x^{(0)}$ than to $x^{(1)}$? Should I rephrase the question? – Inter Veridium Oct 05 '20 at 18:37

1 Answer


The goal is to find $P(|x_0^\prime| - |x_1^\prime| \leq 0)$, where $|\cdot|$ is understood to be the Euclidean distance.

\begin{align} P(|x_0^\prime| - |x_1^\prime| \leq 0) & = P(((-.32)^2 +r_1^2)^{\frac{1}{2}} - ((.68)^2 +r_2^2)^{\frac{1}{2}} \leq 0)\\ &= P((.32^2 +r_1^2)^{\frac{1}{2}} \leq (.68^2 +r_2^2)^{\frac{1}{2}})\\ &= P(r_1^2 - r_2^2 \leq .68^2 - .32^2) \end{align}

The easiest way I can think of to answer this is to integrate the indicator $1\{r_1^2 - r_2^2 \leq .68^2 - .32^2\}$ against the joint density of $r_1^2$ and $r_2^2$. Since $r_1$ and $r_2$ are independent, it suffices to find the marginal densities. For convenience, write $X = r_1^2$ and $Y = r_2^2$.

By the transformation method, we can find the CDF of $X$ in terms of the CDF of $r_1$, which is $P(r_1 \leq z) = \frac{z+1}{2}$ for $z \in [-1,1]$. Analogously, the same can be done for $Y$ and $r_2$, so we skip that step.

Since $r_1$ has support $[-1,1]$, $X$ has support on $[0,1]$. Thus for $x \in [0,1]$

\begin{align} P(X \leq x) &= P(r_1^2 \leq x)\\ &= P(-x^{\frac{1}{2}} \leq r_1 \leq x^{\frac{1}{2}})\\ &= P(r_1 \leq x^{\frac{1}{2}}) - P(r_1 \leq -x^{\frac{1}{2}})\\ &= \frac{x^{\frac{1}{2}} + 1}{2} - \frac{1 - x^{\frac{1}{2}}}{2}\\ &= x^{\frac{1}{2}} \end{align}
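The resulting CDF, $P(X \leq x) = x^{\frac{1}{2}}$ on $[0,1]$, can be spot-checked by simulation; a small pure-Python sketch (variable names are mine):

```python
import random

random.seed(1)
n = 200_000
X = [random.uniform(-1, 1) ** 2 for _ in range(n)]  # X = r1^2 with r1 ~ U[-1, 1]

for x in (0.04, 0.36, 0.81):
    emp = sum(v <= x for v in X) / n
    print(round(emp, 2), round(x ** 0.5, 2))  # empirical CDF vs sqrt(x)
```

The two printed columns should agree to roughly two decimal places at this sample size.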

Since the pdf is the derivative of the CDF with respect to $x$, the pdf of $Y$ is analogous, and $X \perp \!\!\! \perp Y$, we get

\begin{align} p_X(x) &= \frac{1}{2}x^{\frac{-1}{2}}1\{x \in [0,1]\}\\ p_{X,Y}(x,y) &= \frac{1}{4}x^{\frac{-1}{2}}y^{\frac{-1}{2}}1\{x \in [0,1]\}1\{y \in [0,1]\} \end{align}

Now, using this joint density, we can express the original probability as an integral:

\begin{align} P(r_1^2 - r_2^2 \leq .68^2 - .32^2) &= P(X - Y \leq .68^2 - .32^2)\\ &= \int\!\!\int 1\{x - y \leq .68^2 - .32^2\}\,p_{X,Y}(x,y)\,dy\,dx\\ &= \int_0^{.68^2 - .32^2}\int_0^1 \frac{1}{4}x^{-\frac{1}{2}}y^{-\frac{1}{2}}\,dy\,dx + \int_{.68^2 - .32^2}^1\int_{x - (.68^2 - .32^2)}^1 \frac{1}{4}x^{-\frac{1}{2}}y^{-\frac{1}{2}}\,dy\,dx\\ &= .6 + .19775\\ &= .79775 \end{align}
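As a cross-check on the two integrals: the second one evaluates in closed form to $0.18\ln 3 \approx 0.19775$, so the total is $0.6 + 0.18\ln 3 \approx 0.79775$. Below is a small pure-Python sketch that evaluates the inner integral analytically and the outer one with a midpoint rule (the quadrature choice is mine; by symmetry of $X$ and $Y$, the orientation of the region does not matter):

```python
from math import sqrt, log

c = 0.68**2 - 0.32**2  # = 0.36
first = sqrt(c)        # first integral: inner part is 1, outer gives sqrt(c) = 0.6

# Second integral: the inner integral of (1/4) x^{-1/2} y^{-1/2} over [x - c, 1]
# equals (1/2) x^{-1/2} (1 - sqrt(x - c)); integrate that over [c, 1] numerically.
n = 100_000
h = (1 - c) / n
second = sum(
    0.5 / sqrt(c + (k + 0.5) * h) * (1 - sqrt((k + 0.5) * h)) * h
    for k in range(n)
)

total = first + second
print(round(total, 5))                 # ≈ 0.79775
print(round(0.6 + 0.18 * log(3), 5))  # closed form 0.6 + 0.18*ln(3), same value
```

The midpoint rule sidesteps the integrable singularity of $x^{-\frac{1}{2}}$ at the left endpoint, since no evaluation point lands on it.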

To check and make sure I didn't make any obvious errors, here is a numerical simulation in R.

set.seed(682322)
a <- -1
b <- 1
n <- 1000000

r1 <- runif(n, a, b)
r2 <- runif(n, a, b)
c1 <- sqrt(.32^2 + r1^2)  # distance from x to x0
c2 <- sqrt(.68^2 + r2^2)  # distance from x to x1
mean(ifelse(c1 - c2 < 0, 1, 0))

0.797967

Tyrel Stokes
    Great answer, thank you. What I had trouble with is joint density and the integration over this particular area, but now it seems pretty clear to me how to approach tasks like these. Thanks again. – Inter Veridium Oct 08 '20 at 14:47