Seminar 5


Exercise 1

Let \(\alpha \sim \mathrm{Uniform}([0,1])\). Find the following functions:

  1. The probability density function \(p_\beta(x)\), if the random variable \(\beta\) is such that \(\beta=3\alpha-1\).
  2. The probability density function \(p_\gamma(x)\), if the random variable \(\gamma\) is such that \(\gamma=-\ln(\alpha)\).
  3. The probability density function \(p_\kappa(x)\), if the random variable \(\kappa\) is such that \[ \kappa= \begin{cases} 1+\alpha+\alpha^2+\ldots &\alpha \in(0,1) \\ 0 &\alpha\not\in(0,1) \end{cases} \]
  4. The probability density function \(p_\epsilon(x)\), if \[ \epsilon = \begin{cases} \sum_{j=0}^\infty(-1)^j\alpha^j & \alpha \in(0,1) \\ 0 & \alpha\not\in(0,1) \end{cases} \]
  5. The cumulative distribution function \(F_\rho(x)\), if the random variable \(\rho\) is such that \[ \rho= \begin{cases}1 &\text{if $\alpha$ is irrational} \\ 0 &\quad \text{if $\alpha$ is rational} \end{cases} \]

The density of \(\alpha\) is \(p_\alpha(t) = \mathbf{1}_{[0,1)}(t)\).

  1. \(\beta = 3\alpha-1\). This is a linear transformation. \(\alpha = (\beta+1)/3\). The range for \(\beta\) is \([-1, 2]\). \(p_\beta(x) = p_\alpha\left(\frac{x+1}{3}\right) \left|\frac{d\alpha}{d\beta}\right| = 1 \cdot \frac{1}{3} = \frac{1}{3}\) on \([-1,2]\). This is \(\mathrm{Uniform}([-1,2])\). In general, it is straightforward to check that an affine transformation of a uniform random variable is again uniform, on the image of the original interval under the same affine map.

  2. \(\gamma = -\ln(\alpha)\), \(\alpha = e^{-\gamma}\). The range for \(\gamma\) is \([0, \infty)\). \(p_\gamma(x) = p_\alpha(e^{-x}) \left|\frac{d\alpha}{d\gamma}\right| = 1 \cdot |-e^{-x}| = e^{-x}\) on \([0, \infty)\). I.e. \(\gamma \sim \exp(1)\).

  3. \(\kappa = \frac{1}{1-\alpha}\) for \(\alpha \in (0,1)\). \(\alpha = 1 - 1/\kappa\). The range for \(\kappa\) is \((1, \infty)\). \(p_\kappa(x) = p_\alpha(1-1/x) \left|\frac{d\alpha}{d\kappa}\right| = 1 \cdot |1/x^2| = 1/x^2\) on \((1, \infty)\).

  4. \(\epsilon = \frac{1}{1+\alpha}\) for \(\alpha \in (0,1)\), \(\alpha = 1/\epsilon - 1\). The range for \(\epsilon\) is \((1/2, 1)\). \(p_\epsilon(x) = p_\alpha(1/x-1) \left|\frac{d\alpha}{d\epsilon}\right| = 1 \cdot |-1/x^2| = 1/x^2\) on \((1/2, 1)\).

  5. \(\rho=1\) a.s., thus \(F_\rho(x) = \mathbf{1}_{[1, \infty)}(x)\).
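As a quick sanity check (not part of the original solution), the first three densities can be verified by a small Monte Carlo simulation; the sample size and test points below are arbitrary choices.

```python
import math
import random

# Verification sketch: the empirical CDFs of the transformed samples
# should match the closed forms derived above.
random.seed(0)
n = 200_000
alpha = [random.random() for _ in range(n)]

beta  = [3 * a - 1 for a in alpha]       # Uniform([-1, 2])
gamma = [-math.log(a) for a in alpha]    # Exp(1)
kappa = [1 / (1 - a) for a in alpha]     # geometric series; CDF 1 - 1/x on (1, oo)

def ecdf(sample, x):
    """Empirical CDF of `sample` at the point x."""
    return sum(s <= x for s in sample) / len(sample)

print(ecdf(beta, 0.5))     # should be close to (0.5 + 1)/3 = 0.5
print(ecdf(gamma, 1.0))    # should be close to 1 - e^{-1}
print(ecdf(kappa, 2.0))    # should be close to 1 - 1/2 = 0.5
```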

Exercise 2

The random variable \(\alpha\) is uniform on the interval \([-1,3]\). Find the density of \(-|\alpha|\).

Graphically, the map \(x \mapsto -|x|\) pushes the density at each point \(x>0\) onto the point \(-x\). The density of \(-|\alpha|\) is thus \(\tfrac{1}{4} \mathbf{1}_{[-3,-1)}+\tfrac{1}{2}\mathbf{1}_{[-1,0)}\). We can also solve it analytically: for \(y< 0\) \[ f_{-|\alpha|}(y)= \frac{d}{dy} \mathbb{P}(-|\alpha| \le y) = \frac{d}{dy} \mathbb{P}( \alpha \ge -y)+ \frac{d}{dy} \mathbb{P}( \alpha \le y) = p_\alpha(-y)+p_\alpha(y) = \tfrac{1}{4} \mathbf{1}_{[-3,-1)}(y)+\tfrac{1}{2}\mathbf{1}_{[-1,0)}(y) \]
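A hedged simulation sketch (not part of the original solution): the CDF values at the test points follow by integrating the density just derived.

```python
import random

# Sample alpha ~ Uniform([-1, 3]) and compare the empirical CDF of
# -|alpha| with values obtained by integrating the density above.
random.seed(1)
n = 200_000
z = [-abs(-1 + 4 * random.random()) for _ in range(n)]

def ecdf(x):
    return sum(s <= x for s in z) / n

print(ecdf(-2.0))   # integral of 1/4 over [-3, -2]: 1/4
print(ecdf(-0.5))   # 1/4 * 2 + 1/2 * 1/2 = 3/4
```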

Exercise 3

The random variable \(\alpha\) is uniform on the interval \([-1,3]\). Find the cumulative distribution function of \(\frac{|\alpha|}{\alpha}\).

Since \(\mathbb{P}(\alpha=0)=0\), the random variable \(\frac{|\alpha|}{\alpha}\) is almost surely the sign of \(\alpha\). So it equals \(-1\) with probability \(1/4\) and \(+1\) with probability \(3/4\). The distribution function is \(\tfrac{1}{4} \mathbf{1}_{[-1,1)}+\mathbf{1}_{[1,\infty)}\).
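A minimal simulation sketch confirming the two probabilities (sample size is an arbitrary choice):

```python
import random

# The sign of a Uniform([-1, 3]) sample should be -1 with probability
# 1/4 and +1 with probability 3/4.
random.seed(2)
n = 200_000
signs = []
for _ in range(n):
    a = -1 + 4 * random.random()
    if a != 0:                    # P(alpha = 0) = 0, but guard anyway
        signs.append(abs(a) / a)

p_minus = signs.count(-1.0) / len(signs)
print(p_minus)   # should be close to 1/4
```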

Exercise 4

A random variable \(\alpha\) is uniform on the interval \([-1,1]\), and a random variable \(\beta\), independent of \(\alpha\), is a Bernoulli variable with parameter \(p=\frac{1}{3}\).

  1. Find the cumulative distribution function of the random variable \(\alpha \beta\).
  2. Find the cumulative distribution function of the random variable \(|\alpha|\beta\).
  3. Find the cumulative distribution function of the random variable \(|2\alpha-1| \beta\).
  1. \(F_{\alpha \beta}(x)=\frac{x+1}{6} \mathbf{1}_{[-1,0)}(x) + \frac{x+5}{6} \mathbf{1}_{[0,1)}(x) + \mathbf{1}_{[1,\infty)}(x)\).
  2. \(F_{|\alpha| \beta}(x)= \frac{x+2}{3} \mathbf{1}_{[0,1)}(x)+ \mathbf{1}_{[1,\infty)}(x)\).
  3. \(F_{|2\alpha-1| \beta}(x) = \frac{4+x}{6} \mathbf{1}_{[0,1)}(x)+\frac{9+x}{12} \mathbf{1}_{[1,3)}(x) +\mathbf{1}_{[3,\infty)}(x)\).
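The first CDF can be checked with a short Monte Carlo sketch (not part of the original solution; the test points are arbitrary):

```python
import random

# alpha ~ Uniform([-1, 1]), beta ~ Bernoulli(1/3) independent; compare
# the empirical CDF of alpha * beta with the formula in point 1.
random.seed(3)
n = 200_000
samples = []
for _ in range(n):
    a = -1 + 2 * random.random()
    b = 1 if random.random() < 1 / 3 else 0
    samples.append(a * b)

def ecdf(x):
    return sum(s <= x for s in samples) / n

print(ecdf(-0.5))   # (x+1)/6 at x = -0.5: 1/12
print(ecdf(0.5))    # (x+5)/6 at x = 0.5: 11/12
```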

Exercise 5

A random variable \(\alpha\) is uniform on the interval \([0,1]\), and the random variable \(\beta\) is independent of \(\alpha\).

  1. Find the probability density function of the random variable \(2\alpha-\beta\), if \(\beta\) is distributed according to the exponential law with parameter \(1\).
  2. Find the cumulative distribution function of the random variable \(\alpha+\beta\), if \(\beta\) is discrete and distributed according to the Poisson law with parameter \(\lambda\).
  3. Find the cumulative distribution function of the random variable \(\alpha+2\beta\), if \(\beta\) is a geometric random variable with parameter \(p\).
  1. Let \(\xi=2\alpha-\beta\). The density of \(2\alpha\) is \(f_{2\alpha}(x)=\tfrac{1}{2}\mathbf{1}_{[0,2)}\). The density of \(-\beta\) is \(f_{-\beta}(x)=e^x \mathbf{1}_{(-\infty, 0)}\). The density of the sum \(\xi\) is the convolution: \[ f_\xi(y) = \int_{-\infty}^\infty f_{2\alpha}(x) f_{-\beta}(y-x) dx=\frac{\mathbf{1}_{(-\infty,2)}(y)}{2} \int_{\max(0,y)}^2 e^{y-x} dx = \begin{cases} \frac{e^y(1-e^{-2})}{2} & \text{if $y<0$.} \\ \frac{1-e^{y-2}}{2} & \text{if $y\in [0,2)$.} \\ 0 & \text{if $y\ge 2$.} \end{cases} \]

  2. Let \(\eta=\alpha+\beta\); then for \(x \ge 0\) \[ F_\eta(x) = \sum_{k=0}^\infty \mathbb{P}(\eta \le x | \beta=k)\mathbb{P}(\beta=k) = \sum_{k=0}^\infty \mathbb{P}(\alpha \le x-k) \frac{e^{-\lambda}\lambda^k}{k!}= \sum_{k=0}^{\lfloor x \rfloor-1} \frac{e^{-\lambda}\lambda^k}{k!} + (x- \lfloor x \rfloor) \frac{e^{-\lambda}\lambda^{\lfloor x \rfloor}}{{\lfloor x \rfloor}!} \] while \(F_\eta(x)=0\) for \(x<0\).

  3. Let \(\zeta=\alpha+2\beta\). \(\mathbb{P}(\beta=k)=(1-p)^k p\) for \(k=0,1,\dots\). Thus \(F_\zeta(x) = \sum_{k=0}^\infty \mathbb{P}(\alpha \le x-2k) (1-p)^k p\). So, similarly to point 2, \(F_\zeta\) is piecewise affine, interpolating the values of \(F_{2\beta}\) at the even integer points \(x=2k\).
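For point 1, integrating the density gives \(F_\xi(0)=(1-e^{-2})/2\) and \(F_\xi(1)=(2-e^{-1})/2\); a simulation sketch (not part of the original solution) can confirm these values.

```python
import math
import random

# xi = 2*alpha - beta with alpha ~ Uniform([0,1]), beta ~ Exp(1)
# independent; compare empirical and analytic CDF values.
random.seed(4)
n = 200_000
xi = [2 * random.random() - random.expovariate(1.0) for _ in range(n)]

def ecdf(x):
    return sum(s <= x for s in xi) / n

print(ecdf(0.0), (1 - math.exp(-2)) / 2)   # F(0), integrating e^y(1-e^{-2})/2
print(ecdf(1.0), (2 - math.exp(-1)) / 2)   # F(1), adding the [0,1] piece
```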

Exercise 6

A random variable \(\gamma\) is distributed according to the exponential law with parameter \(a\), a random variable \(\theta\) is also distributed according to the exponential law with parameter \(b\), and \(\gamma\), \(\theta\) are independent.

  1. Find the probability density function of the r.v. \(\sqrt{\gamma}\)
  2. Find the probability density function of the r.v. \(\gamma^2\)
  3. Find the probability density function of the r.v. \(1-e^{-a\gamma}\)
  4. Find the probability density function of the r.v. \(\max(\gamma,\theta)\)
  5. Find the probability density function of the r.v. \(\min(\gamma,\theta)\)
  6. Find the probability density function of the r.v. \(\gamma+\theta\)
  1. \(F(y) = \mathbb{P}(\gamma \le y^2) = 1-e^{-ay^2}\) for \(y \ge 0\). \(p(y)=2aye^{-ay^2}\).
  2. \(F(y) = \mathbb{P}(\gamma \le \sqrt{y}) = 1-e^{-a\sqrt{y}}\) for \(y \ge 0\). \(p(y)=\frac{a}{2\sqrt{y}}e^{-a\sqrt{y}}\).
  3. \(\zeta=1-e^{-a\gamma} = F_\gamma(\gamma)\). As we know, this transformation yields \(\mathrm{Uniform}([0,1])\) for any random variable \(\gamma\) with continuous cumulative distribution function.
  4. \(F_{\max}(x) = F_\gamma(x)F_\theta(x) = 1-e^{-ax}-e^{-bx}+e^{-(a+b)x}\). \(p_{\max}(x) = ae^{-ax}+be^{-bx}-(a+b)e^{-(a+b)x}\) for \(x \ge 0\).
  5. \(F_{\min}(x) = 1-(1-F_\gamma(x))(1-F_\theta(x)) = 1-e^{-(a+b)x}\). Namely the minimum is exponential of parameter \(a+b\).
  6. If \(a \neq b\), \(p_{\gamma+\theta}(y) = \frac{ab}{b-a}(e^{-ay}-e^{-by})\) for \(y \ge 0\). If \(a=b\), \(p_{\gamma+\theta}(y) = a^2 y e^{-ay}\) for \(y \ge 0\).
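Point 5 lends itself to a quick Monte Carlo check (a sketch, not part of the original solution; the parameters \(a=2\), \(b=3\) and the test point are arbitrary):

```python
import math
import random

# The minimum of independent Exp(a) and Exp(b) samples should follow
# Exp(a + b), as derived above.
random.seed(5)
a, b = 2.0, 3.0
n = 200_000
m = [min(random.expovariate(a), random.expovariate(b)) for _ in range(n)]

x = 0.3
emp = sum(s <= x for s in m) / n
print(emp, 1 - math.exp(-(a + b) * x))   # empirical vs Exp(a+b) CDF
```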

Exercise 7*

Let \(X_1,X_2\ldots\) be independent random variables, with the same distribution \(\mathrm{exp}(\lambda)\). Let \(Y_n:=\sum_{i=1}^n X_i\) and \(N_t:=\inf\{ n\ge 0 \,:\:Y_{n+1} >t \}\), \(t>0\).

  1. Prove that the distribution of \(Y_n\) has the density \(\rho_n(y):= e^{-\lambda y} \frac{\lambda^n y^{n-1}}{(n-1)!} \mathbf{1}_{y\ge 0}\).
  2. Prove that \(\mathbb{P}(N_t=k)=e^{-\lambda t} (\lambda t)^k/k!\) (this means that \(N_t \sim \mathrm{Poisson}(\lambda t)\)).
  1. Proceed by induction.
    • For \(n=1\), \(Y_1=X_1\), thus \(\rho_1(y) = \lambda e^{-\lambda y}\).
    • Assume the formula is true for \(n\). \(Y_{n+1}=Y_n+X_{n+1}\) is a sum of two independent random variables, so the density of \(Y_{n+1}\) is the convolution of their densities: \[ \begin{aligned} \rho_{n+1}(y) & = \int_0^y \rho_n(x)\rho_1(y-x)dx = \int_0^y \left(e^{-\lambda x}\frac{\lambda^n x^{n-1}}{(n-1)!}\right)(\lambda e^{-\lambda(y-x)})dx \\ & = \frac{\lambda^{n+1}e^{-\lambda y}}{(n-1)!} \int_0^y x^{n-1}dx = e^{-\lambda y}\frac{\lambda^{n+1}y^n}{n!} \end{aligned} \]
  2. Notice that \(\rho_{n+1}'=-\lambda(\rho_{n+1}-\rho_n)\). Thus \[ \begin{aligned} \mathbb{P}(N_t=n) & = \mathbb{P}(N_t < n+1) - \mathbb{P}(N_t < n) = \mathbb{P}(Y_{n+1}>t) - \mathbb{P}(Y_n>t) = \int_t^{\infty} \rho_{n+1}(y)-\rho_n(y) dy=- \int_t^\infty \frac{d}{dy} \left(\frac{1}{\lambda} \rho_{n+1}(y) \right) dy= \tfrac{1}{\lambda}\rho_{n+1}(t) \end{aligned} \] which is the statement to be proved.
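The Poisson claim in point 2 can be checked empirically (a verification sketch, not part of the original solution; \(\lambda\) and \(t\) below are arbitrary test parameters):

```python
import math
import random

# Count exponential arrivals up to time t and compare the empirical
# distribution of N_t with Poisson(lambda * t).
random.seed(6)
lam, t = 1.5, 2.0
n = 100_000
counts = []
for _ in range(n):
    y, k = 0.0, 0
    while True:
        y += random.expovariate(lam)   # Y_{k+1}
        if y > t:                      # first index with Y_{k+1} > t
            break
        k += 1
    counts.append(k)                   # N_t for this run

for k in range(4):
    emp = counts.count(k) / n
    theo = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
    print(k, emp, theo)
```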

Exercise 8

A point \((x,y)\) is chosen from the square \([0,1]\times [0,1]\) uniformly. Find the distribution of the random variables

  1. \(x^2\).
  2. \(x/(x+y)\).
  3. \(x^2+y^2\).
  4. \(\min(x,y)\).
  5. \(\max(x,y)\).
  1. Set \(\xi:=x^2\). Then \(F_\xi(z)=\mathbb{P}(x^2\le z)=\sqrt{z}\) and \(f_\xi(z)=1/(2\sqrt{z})\) for \(z\in(0,1]\).
  2. Set \(\xi=x/(x+y)\). \(\xi\) has the same law as \(1-\xi=y/(x+y)\). So the density satisfies \(f_\xi(z)=f_\xi(1-z)\). Thus take \(z\le 1/2\), and notice \[ \mathbb{P}(\xi \le z)= \mathbb{P}(x \le zy/(1-z))=z/(2(1-z)) \] since this is the area of a triangle with height \(1\) and base \(z/(1-z)\). In particular the density is \(f_\xi(z)=2(1+|2z-1|)^{-2}\) for \(z\in [0, 1]\).
  3. Set \(\xi=x^2+y^2\). For \(z\in[0,1]\) \[ F_\xi(z)=\mathbb{P}(x^2+y^2\le z) = \text{Area of quarter circle of radius $\sqrt{z}$} = \pi z/4 \] For \(z\in(1,2]\), the set \(\{x^2+y^2\le z\}\) is the union of two triangles and a circular sector. The triangles have height \(1\) and base \(\sqrt{z} \sin(\arccos{z^{-1/2}})=\sqrt{z-1}\). The circular sector spans an angle \(\pi/2-2 \arccos(z^{-1/2})\). So if \(z\in (1,2]\) \[ F_\xi(z)= \sqrt{z-1}+z(\pi/4-\arccos{z^{-1/2}}) \] In particular \(f_\xi(z)= \tfrac{\pi}{4}\mathbf{1}_{[0,1]}(z)+\left(\tfrac{\pi}{4}- \arccos(z^{-1/2})\right) \mathbf{1}_{(1,2)}(z)\).
  4. Set \(\xi =\min(x,y)\). \(F_\xi(z)=1-\mathbb{P}(\min(x,y)>z)=1-(1-z)^2=2z-z^2\) and \(f_\xi(z)=2(1-z)\).
  5. Set \(\xi=\max(x,y)\). \(F_\xi(z)=z^2\) and \(f_\xi(z)=2z\).
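The piecewise CDF of point 3 is the least obvious formula here, so a simulation sketch (not part of the original solution) is a useful cross-check:

```python
import math
import random

# Empirical CDF of x^2 + y^2 on the unit square versus the piecewise
# formula derived above.
random.seed(7)
n = 200_000
xi = [random.random() ** 2 + random.random() ** 2 for _ in range(n)]

def ecdf(z):
    return sum(s <= z for s in xi) / n

print(ecdf(0.5), math.pi * 0.5 / 4)   # quarter-disc regime, z in [0, 1]
z = 1.5                               # triangles + sector regime, z in (1, 2]
print(ecdf(z), math.sqrt(z - 1) + z * (math.pi / 4 - math.acos(z ** -0.5)))
```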

Exercise 9

Let the random vector \((\alpha,\beta)\) be uniformly distributed in the region \(\mathcal{G}=\left\{|x|+|y| < 1\right\}\). That is, the corresponding two-dimensional probability density is \[ f_{(\alpha,\beta)}(x,y)= \begin{cases} \mathrm{const} & (x,y) \in\mathcal{G} \\ 0 & (x,y)\not\in\mathcal{G} \end{cases} \tag{1}\]

  1. What is the value of the constant in the formula?
  2. Find the densities \(f_\alpha(x)\), \(f_\beta(y)\) of the distribution of the first coordinate \(\alpha\) and the second coordinate \(\beta\) of the vector.
  3. Are \(\alpha\) and \(\beta\) dependent?
  4. Find the probability densities for \(\alpha+\beta\) and for \(\alpha-\beta\).
  1. The constant is \(1/|\mathcal{G}|=1/2\).
  2. The density of \(\alpha\) is obtained by pushing forward the uniform measure on \(\mathcal{G}\) onto the segment \([-1,1]\). So a graphical visualization immediately shows \(f_\alpha(x)=f_\beta(x)= (1-|x|) \mathbf{1}_{[-1,1]}\). We can also find this by computing \(f_\alpha(x)=\int f_{\alpha,\beta}(x,y)dy\).
  3. They are dependent: e.g. for \(x\in[1/2,1)\), \(\mathbb{P}(\alpha> x,\beta>x)=0\), while \(\mathbb{P}(\alpha> x)=\mathbb{P}(\beta>x)>0\). More generally, we observe that if \((\alpha,\beta)\) are distributed as in Equation 1, they are independent iff \(\mathcal{G}=\mathcal{G}_1 \times \mathcal{G}_2\) (up to a.e. equivalence), in which case \(\alpha\), \(\beta\) are uniform on \(\mathcal{G}_1\), \(\mathcal{G}_2\) respectively.
  4. Let \(U=\alpha+\beta, V=\alpha-\beta\). This is a rotation and scaling. The region \(|x|+|y|<1\) is transformed into the region \(\mathcal{G}'=\{|u|<1, |v|<1\}\). The Jacobian of the transformation from \((u,v)\) to \((x,y)\) is \(1/2\) and thus \((U,V)\) is uniform on \(\mathcal{G}'\). In particular they are i.i.d. and uniformly distributed on \([-1,1]\).
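Point 4 can be checked by rejection sampling on the square (a hedged sketch, not part of the original solution; sample size and test points are arbitrary):

```python
import random

# Sample (alpha, beta) uniformly on the diamond G by rejection from the
# square, then check that U = alpha + beta and V = alpha - beta look
# independent and Uniform([-1, 1]).
random.seed(8)
n = 200_000
us, vs = [], []
while len(us) < n:
    x = -1 + 2 * random.random()
    y = -1 + 2 * random.random()
    if abs(x) + abs(y) < 1:        # accept only points of G
        us.append(x + y)
        vs.append(x - y)

p_u = sum(u <= 0.5 for u in us) / n                           # Uniform CDF: 3/4
p_joint = sum(u <= 0 and v <= 0 for u, v in zip(us, vs)) / n  # 1/2 * 1/2 = 1/4
print(p_u, p_joint)
```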

Exercise 10

Let the random vector \((\alpha,\beta)\) be uniformly distributed in the upper semicircle \(\mathcal{G}=\left\{x^2+y^2 < 1, y> 0\right\}\). That is, the corresponding two-dimensional probability density is \[ f_{(\alpha,\beta)}(x,y)= \begin{cases} \mathrm{const} & (x,y) \in\mathcal{G} \\ 0 & (x,y)\not\in\mathcal{G} \end{cases} \]

  1. What is the value of the constant in the formula?
  2. Find the density \(f_\alpha(x)\) of the first coordinate \(\alpha\) of the vector.
  3. Find the probability density for \(\rho=\sqrt{\alpha^2+\beta^2}\). Draw the graph of \(f_\rho(t)\).
  4. Find the probability density for \(\phi=\arccos(\alpha/\sqrt{\alpha^2+\beta^2})\). Draw the graph of \(f_\phi(t)\).
  5. Are \(\rho\) and \(\phi\) dependent?
  6. Find the probability density for \(\xi=\alpha/\beta\). Draw the graph of \(f_\xi(t)\).
  7. Find the probability density for \(\eta=\alpha^2/\beta^2\). Draw the graph of \(f_\eta(t)\).
  8. Find the probability density for \(\theta=\alpha^2+\beta^2\). Draw the graph of \(f_\theta(t)\).
  1. The constant is \(1/|\mathcal{G}|=2/\pi\).
  2. The density of \(\alpha\) is obtained by pushing forward the uniform measure on \(\mathcal{G}\) onto the segment \([-1,1]\). So a graphical visualization immediately shows \(f_\alpha(x)=\frac{2}{\pi}\sqrt{1-x^2} \mathbf{1}_{[-1,1]}\). We can also find this by computing \(f_\alpha(x)=\int f_{\alpha,\beta}(x,y)dy\).
  3. \(\rho\) is the polar radius. \(F_\rho(r) = \mathbb{P}(\rho\le r) = (\pi r^2/2)/(\pi/2) = r^2\) for \(r\in[0,1]\). So \(f_\rho(r)=2r \mathbf{1}_{[0,1)}\).
  4. \(\phi\) is the polar angle. It is uniformly distributed on \([0,\pi]\), i.e. \(f_\phi(t)=\tfrac{1}{\pi}\mathbf{1}_{[0,\pi]}(t)\).
  5. In polar coordinates, the joint density is \(f(r,\phi) = \frac{2}{\pi} r = (2r)\cdot\frac{1}{\pi} = f_\rho(r) f_\phi(\phi)\); since it factorizes, \(\rho\) and \(\phi\) are independent.
  6. \(\xi=\alpha/\beta = \cot(\phi)\). With \(\operatorname{arccot}\) taking values in \((0,\pi)\), \(F_\xi(x) = \mathbb{P}(\cot\phi \le x) = \mathbb{P}(\phi \ge \operatorname{arccot}(x)) = \frac{\pi-\operatorname{arccot}(x)}{\pi}\). Hence \(f_\xi(x) = \frac{1}{\pi(1+x^2)}\) (the Cauchy distribution).
  7. \(\eta=\xi^2\). Since \(\xi\) has a continuous distribution, \(F_\eta(y)=\mathbb{P}(\xi^2\le y)=F_\xi(\sqrt{y})-F_\xi(-\sqrt{y})\), so \(f_\eta(y)=\frac{1}{\pi\sqrt{y}(1+y)} \mathbf{1}_{(0,\infty)}(y)\).
  8. \(\theta=\rho^2\). \(F_\theta(t)=\mathbb{P}(\rho^2\le t)=(\sqrt{t})^2=t\) for \(t\in[0,1]\) (uniform distribution).
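The Cauchy claim of point 6 can be verified by rejection sampling on the half-disc (a sketch, not part of the original solution; note that \(\frac{\pi-\operatorname{arccot}(t)}{\pi} = \frac{1}{2}+\frac{\arctan(t)}{\pi}\), which is the form used below):

```python
import math
import random

# Sample (alpha, beta) uniformly on the upper half-disc by rejection and
# compare the empirical CDF of alpha/beta with the Cauchy CDF.
random.seed(9)
n = 200_000
xs = []
while len(xs) < n:
    x = -1 + 2 * random.random()
    y = random.random()
    if y > 0 and x * x + y * y < 1:   # accept only points of G
        xs.append(x / y)

def cdf(t):
    return sum(s <= t for s in xs) / n

for t in (-1.0, 0.0, 2.0):
    print(cdf(t), 0.5 + math.atan(t) / math.pi)   # empirical vs Cauchy CDF
```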