Example Problems on Random Variables

ST779 Homework 5 on Random Variables

1

Let $X$ be a real-valued random variable. Given any $\epsilon > 0$, show that there exists an $M>0$ and a random variable $Y$ with $\mid Y \mid \leq M$ such that $P(X\neq Y) < \epsilon$.

We will proceed using the approximation theorem. That is, define

\[f_n(\omega) = \begin{cases} 2^n & \text{if } f(\omega) \geq 2^n \\ -2^n & \text{if } f(\omega) < -2^n \\ k 2^{-n} & \text{if } k 2^{-n} \leq f(\omega) < (k+1) 2^{-n}, \ \ k = -2^{2n}, \dots, 2^{2n}-1. \end{cases}\]

Then $f_n(\omega) \rightarrow f(\omega) = X(\omega)$ for every $\omega$. That is, for every $\omega$ and every $\varepsilon > 0$ there exists $N$ such that for all $n \geq N$,

\[\mid f_n(\omega) - f(\omega) \mid < \varepsilon.\]

The dyadic rounding is not the issue here; the truncation level is. Each $f_n$ agrees with $X$ to within $2^{-n}$ on $\{\mid X \mid < 2^n\}$, so the only obstruction to having $Y = X$ with high probability is the event $\{\mid X \mid \geq 2^n\}$. Since $X$ is real-valued, $\{\mid X \mid \geq 2^n\} \downarrow \varnothing$ as $n \rightarrow \infty$, so by continuity from above there exists $K$ large enough that

\[P(\mid X \mid \geq 2^K) < \epsilon.\]

Then take $Y = X \, \mathbb I_{\{\mid X \mid < 2^K\}}$, i.e., truncate rather than discretize, so that $Y = X$ exactly on $\{\mid X \mid < 2^K\}$. Notice that $Y$ is bounded by $M = 2^K$, and $\{X \neq Y\} \subset \{\mid X \mid \geq 2^K\}$, so $P(X \neq Y) \leq P(\mid X \mid \geq 2^K) < \epsilon$.
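As a quick numerical sanity check of this truncation argument (my addition, not part of the proof; the standard Cauchy distribution for $X$ and the sample size are chosen only for illustration), pick $M$ with $P(\mid X \mid > M) = \epsilon/2$ and verify that the empirical frequency of $\{X \neq Y\}$ stays below $\epsilon$:

```python
import numpy as np

rng = np.random.default_rng(0)
eps = 0.01

# For a standard Cauchy, P(|X| > M) = 1 - (2/pi) * arctan(M) -> 0 as M -> infinity,
# so we can solve for an M with P(|X| > M) = eps/2 < eps.
M = np.tan(np.pi / 2 * (1 - eps / 2))

X = rng.standard_cauchy(1_000_000)
Y = np.where(np.abs(X) <= M, X, 0.0)   # Y = X * 1{|X| <= M}, so |Y| <= M

print("M =", M)
print("empirical P(X != Y) =", np.mean(X != Y))   # roughly eps/2 = 0.005 < eps
```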

2

Let $X:[-1,1] \rightarrow [0,1]$ be a random variable defined by $X(\omega) = \omega^2$ for $- 1 \leq \omega < 0$ and $X(\omega) = \omega^3$ for $0 \leq \omega \leq 1$. Let $P$ be the uniform distribution on $[-1,1]$, i.e. $P(B) = \lambda(B) / 2$, $B$ any Borel subset of $[-1,1]$ and $\lambda$ the Lebesgue measure. Find $P_X([a,b])$, where $[a,b]$ is a Borel subset of $[0,1]$ and $P_X$ is the induced distribution, i.e., $P_X(C) = P(X^{-1}(C))$, $C$ a Borel subset of $[0,1]$.

Our preimage $X^{-1}([a,b])$ splits according to the sign of $\omega$:

\[\begin{align} X^{-1}([a,b]) & = \left\{ \omega : a \leq X(\omega) \leq b \right\} \\ & = \left\{ \omega \in [-1,0) : a \leq \omega^2 \leq b \right\} \cup \left\{ \omega \in [0,1] : a \leq \omega^3 \leq b \right\} \\ & = [ -b^{1/2}, -a^{1/2} ] \cup [ a^{1/3}, b^{1/3} ]. \end{align}\]

Now we can apply our Lebesgue measure to get the induced distribution. The two pieces overlap in at most the single point $\omega = 0$, so

\[\begin{align} P_X([a,b]) & = P(X^{-1}([a,b])) \\ & = \frac{ \lambda([ -b^{1/2}, -a^{1/2} ]) }{ 2 } + \frac{ \lambda([ a^{1/3}, b^{1/3} ]) }{ 2 } \\ & = \frac{ b^{1/2} - a^{1/2} }{ 2 } + \frac{ b^{1/3} - a^{1/3} }{ 2 }. \end{align}\]

As a sanity check, $P_X([0,1]) = \frac{1}{2} + \frac{1}{2} = 1$.
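A quick Monte Carlo check of this formula (my addition; the sample size and the test intervals are arbitrary): sample $\omega$ uniformly on $[-1,1]$, apply $X$, and compare empirical frequencies against the closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
omega = rng.uniform(-1.0, 1.0, size=1_000_000)   # P = uniform on [-1, 1]
X = np.where(omega < 0, omega**2, omega**3)      # X(omega) as defined above

def P_X(a, b):
    """Closed-form induced probability derived above."""
    return (np.sqrt(b) - np.sqrt(a) + b**(1/3) - a**(1/3)) / 2

for a, b in [(0.0, 1.0), (0.25, 0.5), (0.1, 0.9)]:
    empirical = np.mean((a <= X) & (X <= b))
    print(f"[{a}, {b}]: empirical {empirical:.4f} vs formula {P_X(a, b):.4f}")
```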

3

Let $\Omega$ be a sample space and $\mathcal A$ be a $\sigma$-field on $\Omega$. Let $X$ be a random variable. Show that the induced $\sigma$-field $\sigma \langle X \rangle = \{ X^{-1}(B): B\in \mathcal R \}$ is countably generated.

Recall that $\mathcal{ R }$ is countably generated by the rational intervals $\{ (r,s): r < s; \ r,s \in \mathbb{ Q } \}$.
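For instance, every open interval with real endpoints, and hence every Borel set, can be recovered from this countable family (a standard fact, included here for context):

\[(a,b) = \bigcup \big\{ (r,s) : r,s \in \mathbb{ Q }, \ a < r < s < b \big\}, \qquad a < b.\]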

Now notice that, for each $B \in \mathcal{ R }$,

\[X^{-1}(B) = \left\{ \omega : X(\omega) \in B \right\}.\]

So take the countable collection

\[\mathcal{ C }_X = \big\{ X^{-1}\big( (r,s) \big) : r < s; \ r,s \in \mathbb{ Q } \big\}.\]

Since $\mathcal{ C }_X$ is indexed by pairs of rationals, it is a countable collection; now we need to show that $\sigma \langle \mathcal{ C }_X \rangle = \sigma \langle X \rangle$, so that $\mathcal{ C }_X$ is a countable generator of $\sigma \langle X \rangle$. We will proceed with the Good Sets Principle, using the good sets

\[\mathcal{ G } = \{ G \in \sigma \langle \mathcal{ C }_X \rangle : G \in \sigma \langle X \rangle\} \subset \sigma \langle \mathcal{ C }_X \rangle\]

(i) $\mathcal{ C }_X$ is a generator for $\sigma \langle \mathcal{ C }_X \rangle$ by definition. ✅

(ii) Take $X^{-1}[(r,s)] \in \mathcal{ C }_X$. Since $(r,s) \in \mathcal{ R }$, $X^{-1}[(r,s)] \in \sigma \langle X \rangle$. So, $\mathcal{ C }_X \subset \mathcal{ G }$.

(iii) Showing $\mathcal{ G }$ is a $\sigma$-field.

(a) Since $\varnothing, \Omega \in \sigma \langle X \rangle$, we have $\varnothing , \Omega \in \mathcal{G}$.

(b) Take $G_1, G_2, \dots \in \mathcal{G}$. Then $G_1, G_2, \dots \in \sigma \langle X \rangle$. Since it is a $\sigma$-field, $\bigcup_{i=1}^{\infty} G_i \in \sigma \langle X \rangle$. Thus $\bigcup_{i=1}^{\infty} G_i \in \mathcal{G}$.

(c) Take $G \in \mathcal{G}$; then $G \in \sigma \langle X \rangle$. Then $G^C \in \sigma \langle X \rangle$ because $\sigma \langle X \rangle$ is a $\sigma$-field. Thus, $G^C \in \mathcal{G}$. ✅

So, by the Good Sets Principle, $\mathcal{ G } = \sigma \langle \mathcal{C}_X \rangle$, i.e., $\sigma \langle \mathcal{C}_X \rangle \subset \sigma \langle X \rangle$. For the reverse inclusion, note that $\mathcal{ D } = \{ B \in \mathcal{ R } : X^{-1}(B) \in \sigma \langle \mathcal{ C }_X \rangle \}$ is a $\sigma$-field (preimages commute with complements and countable unions; see the display below) containing every rational interval $(r,s)$, hence $\mathcal{ D } = \mathcal{ R }$ and $\sigma \langle X \rangle \subset \sigma \langle \mathcal{ C }_X \rangle$. Therefore $\sigma \langle X \rangle = \sigma \langle \mathcal{C}_X \rangle$ is countably generated.
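The reverse-inclusion step uses only the standard identities showing that preimages commute with set operations, which is what makes $\mathcal{ D }$ a $\sigma$-field:

\[X^{-1}(B^c) = \big( X^{-1}(B) \big)^c, \qquad X^{-1}\Big( \bigcup_{i=1}^{\infty} B_i \Big) = \bigcup_{i=1}^{\infty} X^{-1}(B_i).\]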

4

Let $a<b$ be real numbers. Construct a sequence of continuous functions $f_n: \mathbb R \rightarrow \mathbb R$ such that $f_n(x) \rightarrow \mathbb I_{[a,b]}(x)$ as $n\rightarrow \infty$ for all $x$.

What about intervals $(a,b)$ and $(a,b]$?

We need continuous functions that converge to $0$ outside of $[a,b]$ and to $1$ on $[a,b]$. We can accomplish this with

\[f_n(x) = \begin{cases} e^{-n(x-a)^2} & x<a \\ 1 & a \leq x \leq b \\ e^{-n(x-b)^2} & b < x. \end{cases}\]

At $x = a$ and $x = b$ both adjacent branches equal $1$, so each $f_n$ is continuous; and for $x \notin [a,b]$ we have $(x-a)^2 > 0$ or $(x-b)^2 > 0$, so $f_n(x) \rightarrow 0$ there. Hence $f_n(x) \rightarrow \mathbb I_{[a,b]}(x)$ for all $x$.

For the interval $(a,b)$ we cannot simply flip the inequalities: the exponential branch equals $1$ at the endpoint itself, so the limit at $x = a$ and $x = b$ would be $1$ instead of $0$. Instead, use linear ramps that vanish at the open endpoints. For $n$ large enough that $a + \tfrac 1 n < b - \tfrac 1 n$ (the first few $f_n$ can be any continuous functions, since only the limit matters), take

\[f_n(x) = \begin{cases} 0 & x \leq a \\ n(x-a) & a < x < a + \tfrac 1 n \\ 1 & a + \tfrac 1 n \leq x \leq b - \tfrac 1 n \\ n(b-x) & b - \tfrac 1 n < x < b \\ 0 & b \leq x. \end{cases}\]

Each $f_n$ is continuous, $f_n(a) = f_n(b) = 0$ for every $n$, and any fixed $x \in (a,b)$ eventually lies in the middle plateau, so $f_n(x) \rightarrow \mathbb I_{(a,b)}(x)$ for all $x$.

And for the interval $(a,b]$, ramp up at $a$ but keep the exponential tail at $b$ (for $n$ large enough that $a + \tfrac 1 n \leq b$):

\[f_n(x) = \begin{cases} 0 & x \leq a \\ n(x-a) & a < x < a + \tfrac 1 n \\ 1 & a + \tfrac 1 n \leq x \leq b \\ e^{-n(x-b)^2} & b < x. \end{cases}\]
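As a numerical sanity check (my addition; the endpoints $a = 0$, $b = 1$ and the test points are arbitrary), the $(a,b]$ construction evaluates to $0$ at $a$, stays at $1$ on the interior and at $b$, and decays to $0$ just past $b$ as $n$ grows:

```python
import numpy as np

a, b = 0.0, 1.0   # arbitrary endpoints for the check

def f_n(x, n):
    """The piecewise f_n above approximating the indicator of (a, b]."""
    x = np.asarray(x, dtype=float)
    return np.select(
        [x <= a, x < a + 1 / n, x <= b],     # conditions checked in order
        [0.0, n * (x - a), 1.0],             # zero, linear ramp, plateau
        default=np.exp(-n * (x - b) ** 2),   # the x > b branch
    )

for n in [2, 10, 100, 1000]:
    print(n, f_n([a, (a + b) / 2, b, b + 0.1], n))
```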

5

Let $\mathcal A_n$, $n \geq 1$, be a sequence of $\sigma$-fields on $\Omega$. Show that $\{ \mathcal A_n: n = 1, 2, \dots \}$ is mutually independent if and only if for all $n$, $\mathcal A_n$ is independent of $\sigma \langle \mathcal A_1, \dots , \mathcal A_{n-1} \rangle$.

Stated in terms of random variables: $\{ X_n: n = 1, 2, \dots \}$ is mutually independent if and only if for all $n$, $X_n$ is independent of $\{ X_k: k = 1, 2, \dots, n-1 \}$, with the connection $\mathcal A_n = \sigma \langle X_n \rangle$, the $\sigma$-field induced by $X_n$.

Let’s first assume that $\{ \mathcal A_n: n = 1, 2, \dots \}$ is mutually independent. We want to show that, for arbitrary $n$, $\mathcal{ A }_n$ is independent of $\sigma \langle \mathcal{ A }_1, \dots , \mathcal{ A}_{n-1} \rangle$. Take $A_i \in \mathcal{ A }_i$ for $i = 1, \dots, n$, and first consider $C = A_1 \cap \dots \cap A_{n-1}$. Then

\[\begin{align} P(A_n \cap C) & = P(A_n \cap A_1 \cap \dots \cap A_{n-1}) \\ & = P(A_n) P(A_1) \dots P(A_{n-1}) & \text{by mutual independence of all } \mathcal{ A }_i \\ & = P(A_n) P(A_1 \cap \dots \cap A_{n-1}) \\ & = P(A_n) P(C). ✅ \end{align}\]

Sets of the form $A_1 \cap \dots \cap A_{n-1}$ constitute a $\pi$-system that generates $\sigma \langle \mathcal A_1, \dots , \mathcal A_{n-1} \rangle$, so by the $\pi$-$\lambda$ theorem the identity $P(A_n \cap C) = P(A_n) P(C)$ extends to every $C \in \sigma \langle \mathcal A_1, \dots , \mathcal A_{n-1} \rangle$. Thus, for all $n$, $\mathcal A_n$ is independent of $\sigma \langle \mathcal A_1, \dots , \mathcal A_{n-1} \rangle$.

Now let’s assume that for all $n$, $\mathcal A_n$ is independent of $\sigma \langle \mathcal A_1, \dots , \mathcal A_{n-1} \rangle$. In order to show that $T = \{ \mathcal A_n: n = 1, 2, \dots \}$ is mutually independent, we need to show that every finite subset of $T$ is mutually independent. Take an arbitrary finite set of indices $J = \{ j_1, \dots, j_m \}$. Without loss of generality, assume that $j_1 < \dots < j_m$. Notice that

\[\{ \mathcal{ A }_{j_1}, \dots , \mathcal{ A }_{j_m} \} \subset \{ \mathcal{ A }_1, \dots , \mathcal{ A }_{j_m} \}.\]

We know that $\mathcal A_{j_m}$ is independent of $\sigma \langle \mathcal A_1, \dots , \mathcal A_{j_{m} - 1} \rangle$, and for any $A_{j_i} \in \mathcal{ A }_{j_i}$ the intersection $A_{j_1} \cap \dots \cap A_{j_{m-1}}$ lies in $\sigma \langle \mathcal A_1, \dots , \mathcal A_{j_{m} - 1} \rangle$, so we can peel off $P(A_{j_m})$, then $P(A_{j_{m-1}})$, and so on (see the display below). Hence $\{ \mathcal{ A }_{j_1}, \dots , \mathcal{ A }_{j_m} \}$ is mutually independent. So we have shown that an arbitrary finite subset of $T$ is mutually independent, and therefore $\{ \mathcal A_n: n = 1, 2, \dots \}$ is mutually independent.
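Explicitly, with $A_{j_i} \in \mathcal{ A }_{j_i}$, peeling off the largest remaining index at each step gives

\[\begin{align} P(A_{j_1} \cap \dots \cap A_{j_m}) & = P(A_{j_m}) \, P(A_{j_1} \cap \dots \cap A_{j_{m-1}}) \\ & = P(A_{j_m}) \, P(A_{j_{m-1}}) \, P(A_{j_1} \cap \dots \cap A_{j_{m-2}}) \\ & \ \ \vdots \\ & = \prod_{i=1}^{m} P(A_{j_i}), \end{align}\]

where each equality uses that the intersection of the remaining sets lies in $\sigma \langle \mathcal A_1, \dots , \mathcal A_{j_k - 1} \rangle$ for the current largest index $j_k$, which is independent of $\mathcal A_{j_k}$ by hypothesis.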