Example Problems on Convergence of Random Variables
ST779 Homework 7 on Convergence of Random Variables
1
Let $X_n$, $X$, $Y$ be random variables on a probability space $(\Omega, \mathcal{A}, P)$ satisfying $|X_n| \le Y$ for all $n$, $X_n \to X$ (pointwise convergence), and $E(Y^2) < \infty$. Show that $E[(X_n - X)^2] \to 0$.
Notice that $|X_n| \le Y$ for all $n$ together with $X_n \to X$ implies $|X| \le Y$ as well. Hence
$$(X_n - X)^2 \le (|X_n| + |X|)^2 \le 4Y^2, \qquad E(4Y^2) < \infty.$$
A family dominated by a single integrable random variable is uniformly integrable, so $\{(X_n - X)^2\}$ is uniformly integrable. (Note also that taking $\psi(x) = x^2$, we have $\sup_n E[\psi(|X_n|)] \le E(Y^2) < \infty$ and $\psi(x)/x \uparrow \infty$ as $x \to \infty$, so $\{X_n\}$ itself is uniformly integrable.)
Now, since $X_n \to X$ pointwise, $X_n - X \to 0$, and since $g(x) = x^2$ is continuous, the continuous mapping theorem gives $(X_n - X)^2 \to 0$ pointwise.
Finally, since $\{(X_n - X)^2\}$ is uniformly integrable and $(X_n - X)^2 \to 0$,
$$E[|(X_n - X)^2|] = E[(X_n - X)^2] \to 0.$$
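The convergence above can be illustrated numerically. The following is a small Monte Carlo sketch, not part of the proof; the choices $\Omega = [0,1]$ with Lebesgue measure, $Y(\omega) = \omega^{-1/8}$ (so $E(Y^2) = 4/3 < \infty$), and $X_n = Y \mathbf{1}_{\{\omega > 1/n\}} \to X = Y$ are illustrative assumptions, not taken from the problem.

```python
import numpy as np

# Sketch (illustrative example, not part of the proof):
# Omega = [0,1] with Lebesgue measure, Y(w) = w**(-1/8), so E(Y^2) = 4/3 < inf.
# X_n(w) = Y(w) * 1{w > 1/n} -> X(w) = Y(w) pointwise, and |X_n| <= Y.
rng = np.random.default_rng(0)
w = rng.uniform(size=1_000_000)
Y = w ** -0.125
X = Y

def mse(n):
    """Monte Carlo estimate of E[(X_n - X)^2]."""
    Xn = Y * (w > 1.0 / n)
    return np.mean((Xn - X) ** 2)

# Exact value is (4/3) * n**(-3/4), so the estimates should shrink toward 0.
errors = [mse(n) for n in (10, 100, 1000)]
```

The estimated second moments decrease roughly like $n^{-3/4}$, matching the exact value $\int_0^{1/n} \omega^{-1/4}\, d\omega = \tfrac{4}{3} n^{-3/4}$.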
2

Let $X$ be a random variable on a probability space $(\Omega, \mathcal{A}, P)$. Define the moment generating function of $X$ by $\phi(\lambda) = E(e^{\lambda X})$. Let $\Lambda = \{\lambda \in \mathbb{R} : \phi(\lambda) < \infty\}$. Show that $\phi(\lambda)$ is continuous at any interior point of $\Lambda$.
Let $\lambda_0$ be an interior point of $\Lambda$ and let $\lambda_n \to \lambda_0$; it suffices to show $\phi(\lambda_n) \to \phi(\lambda_0)$. Since $\lambda_0$ is interior, there exists $\delta > 0$ with $[\lambda_0 - \delta, \lambda_0 + \delta] \subset \Lambda$, and we may assume $|\lambda_n - \lambda_0| \le \delta$ for all $n$. For fixed $x$, the map $\lambda \mapsto e^{\lambda x}$ is monotone in $\lambda$, so its maximum over $[\lambda_0 - \delta, \lambda_0 + \delta]$ is attained at an endpoint. Hence
$$e^{\lambda_n X} \le e^{(\lambda_0 - \delta)X} + e^{(\lambda_0 + \delta)X} =: Z, \qquad E(Z) = \phi(\lambda_0 - \delta) + \phi(\lambda_0 + \delta) < \infty.$$
A family dominated by a single integrable random variable is uniformly integrable, so $\{e^{\lambda_n X}\}$ is uniformly integrable. Moreover, $\lambda_n \to \lambda_0$ implies $e^{\lambda_n X} \to e^{\lambda_0 X}$ pointwise. Therefore
$$|\phi(\lambda_n) - \phi(\lambda_0)| = \left|E[e^{\lambda_n X} - e^{\lambda_0 X}]\right| \le E\left|e^{\lambda_n X} - e^{\lambda_0 X}\right| \to 0,$$
where the inequality is Jensen's and the limit follows from pointwise convergence together with uniform integrability. Since the sequence $\lambda_n \to \lambda_0$ was arbitrary, $\phi$ is continuous at $\lambda_0$.
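As a numerical sanity check (an illustration, not required by the problem): for $X \sim N(0,1)$ the MGF is $\phi(\lambda) = e^{\lambda^2/2}$ with $\Lambda = \mathbb{R}$, so every point is interior, and we can watch $\phi(\lambda_n) \to \phi(\lambda_0)$ by Monte Carlo. The choice of distribution and of $\lambda_0 = 1$, $\lambda_n = \lambda_0 + 1/n$ is assumed for illustration.

```python
import math
import numpy as np

# Sketch: X ~ N(0,1), phi(lambda) = E[e^{lambda X}] = exp(lambda^2 / 2).
# Estimate phi(lambda_n) by Monte Carlo and compare with the exact phi(lambda_0).
rng = np.random.default_rng(1)
X = rng.standard_normal(1_000_000)
lam0 = 1.0

def phi_hat(lam):
    """Monte Carlo estimate of the MGF at lam."""
    return np.mean(np.exp(lam * X))

def phi_exact(lam):
    return math.exp(lam * lam / 2.0)

# |phi(lambda_n) - phi(lambda_0)| should shrink as lambda_n -> lambda_0.
gaps = [abs(phi_hat(lam0 + 1.0 / n) - phi_exact(lam0)) for n in (2, 10, 50)]
```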
3
Let $X_n$ and $Y_n$ be random variables such that $X_n$ and $Y_n$ are independent for each $n$, and suppose $X_n \to X$, $Y_n \to Y$ (pointwise convergence). Show that $X$ and $Y$ are independent.
Define $g(x) = e^{itx}$. Since $g$ is continuous, $g(X_n) \to g(X)$ and $g(Y_n) \to g(Y)$ pointwise. Also, $e^{itx} = \cos(tx) + i\sin(tx)$, so $|e^{itx}| \le 1$ for all $t$ and $x$.

So we can use the DCT: $g(X_n) \to g(X)$ pointwise, $|g(X_n)| \le 1$, and $E(1) < \infty$, hence $E[g(X_n)] \to E[g(X)]$. Thus the characteristic function of $X_n$ converges (for each $t$) to the characteristic function of $X$. The same holds for $Y_n$ and $Y$.
Then, writing $\phi_{(X_n,Y_n)}(t,u) = E(e^{itX_n + iuY_n})$, independence of $X_n$ and $Y_n$ gives
$$\phi_{(X_n,Y_n)}(t,u) = E(e^{itX_n})E(e^{iuY_n}) = \phi_{X_n}(t)\,\phi_{Y_n}(u).$$
Applying the DCT once more to the bounded sequence $e^{itX_n + iuY_n} \to e^{itX + iuY}$,
$$\phi_{(X,Y)}(t,u) = \lim_{n\to\infty} \phi_{(X_n,Y_n)}(t,u) = \lim_{n\to\infty} \phi_{X_n}(t)\,\phi_{Y_n}(u) = \phi_X(t)\,\phi_Y(u) = E(e^{itX})E(e^{iuY}).$$
Since the joint characteristic function factors for all $(t,u)$, $X$ and $Y$ are independent.
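The factorization of the joint characteristic function can be seen empirically. The sketch below (illustrative only; the choice $X, Y \sim N(0,1)$ independent and the test point $(t,u) = (0.7, -1.3)$ are assumptions) compares the empirical joint characteristic function with the product of the marginals.

```python
import numpy as np

# Sketch: for independent X, Y the joint characteristic function
# E[e^{i(tX + uY)}] should equal E[e^{itX}] * E[e^{iuY}].
rng = np.random.default_rng(2)
X = rng.standard_normal(500_000)
Y = rng.standard_normal(500_000)
t, u = 0.7, -1.3

def cf(Z, s):
    """Empirical characteristic function of Z at s."""
    return np.mean(np.exp(1j * s * Z))

joint = np.mean(np.exp(1j * (t * X + u * Y)))
product = cf(X, t) * cf(Y, u)
# For independent samples the gap is Monte Carlo noise only.
factorization_gap = abs(joint - product)
```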
4
Let $X$ be an integrable random variable on a probability space $(\Omega, \mathcal{A}, P)$ and let $A_n, A \in \mathcal{A}$ be such that $P(A_n \Delta A) \to 0$ as $n \to \infty$. Show that $\int_{A_n} X \, dP \to \int_A X \, dP$.
Since $P(A_n \Delta A) \to 0$ and the union below is disjoint, we know
$$P(A_n \Delta A) = P\big((A_n \setminus A) \cup (A \setminus A_n)\big) = P(A_n \setminus A) + P(A \setminus A_n) \to 0.$$
Thus $P(A_n \setminus A) \to 0$ and $P(A \setminus A_n) \to 0$. Now we can look at $\int_{A_n} X \, dP$.
$$\int_{A_n} X \, dP = \int_{(A_n \cap A) \cup (A_n \setminus A)} X \, dP = \int_{A \setminus (A \setminus A_n)} X \, dP + \int_{A_n \setminus A} X \, dP = E[X \mathbf{1}_A] - E[X \mathbf{1}_{A \setminus A_n}] + E[X \mathbf{1}_{A_n \setminus A}],$$
using $A_n \cap A = A \setminus (A \setminus A_n)$. Since $X$ is integrable, for every $\varepsilon > 0$ there exists $M$ such that $\int_{\{|X| > M\}} |X| \, dP < \varepsilon$. Now we can look at one of the remainder terms.
$$\left|E[X \mathbf{1}_{A \setminus A_n}]\right| \le E\big[|X| \mathbf{1}_{(A \setminus A_n) \cap \{|X| > M\}}\big] + E\big[|X| \mathbf{1}_{(A \setminus A_n) \cap \{|X| \le M\}}\big] \le \int_{\{|X| > M\}} |X| \, dP + M \, P(A \setminus A_n) < \varepsilon + M \, P(A \setminus A_n).$$
Since $P(A \setminus A_n) \to 0$ and $\varepsilon > 0$ was arbitrary, $E[X \mathbf{1}_{A \setminus A_n}] \to 0$. Similarly, $E[X \mathbf{1}_{A_n \setminus A}] \to 0$. Thus,
$$\int_{A_n} X \, dP = \int_A X \, dP - \int_{A \setminus A_n} X \, dP + \int_{A_n \setminus A} X \, dP \to \int_A X \, dP - 0 + 0 = \int_A X \, dP.$$
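A concrete instance can be checked by Monte Carlo. The sketch below is illustrative only; the choices $\Omega = [0,1]$ with Lebesgue measure, $X(\omega) = \omega^{-1/3}$ (integrable but unbounded), $A = [0, 1/2]$, and $A_n = [1/n, 1/2 + 1/n]$ (so $P(A_n \Delta A) = 2/n \to 0$) are assumptions, not taken from the problem.

```python
import numpy as np

# Sketch: Omega = [0,1] with Lebesgue measure, X(w) = w**(-1/3) integrable,
# A = [0, 1/2], A_n = [1/n, 1/2 + 1/n], so P(A_n Delta A) = 2/n -> 0.
rng = np.random.default_rng(3)
w = rng.uniform(size=1_000_000)
X = w ** (-1.0 / 3.0)
target = np.mean(X * (w <= 0.5))  # Monte Carlo estimate of int_A X dP

def integral_over_An(n):
    """Monte Carlo estimate of int_{A_n} X dP."""
    An = (w >= 1.0 / n) & (w <= 0.5 + 1.0 / n)
    return np.mean(X * An)

# The gaps should shrink toward 0 as P(A_n Delta A) -> 0.
diffs = [abs(integral_over_An(n) - target) for n in (4, 20, 100)]
```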
5

Let $X_n$, $Y_n$, $X$, $Y$ be random variables on a probability space $(\Omega, \mathcal{A}, P)$ satisfying $0 \le X_n \le Y_n$ for all $n$, $X_n \to X$, $Y_n \to Y$ (pointwise convergence), and $E(Y_n) \to E(Y) < \infty$. Show that $E(X_n) \to E(X)$.
Notice $X_n - Y_n \le 0$ for all $n$, so the reverse Fatou lemma applies:
$$E\big[\limsup_n (X_n - Y_n)\big] \ge \limsup_n E[X_n - Y_n].$$
Since $X_n \to X$ and $Y_n \to Y$ pointwise, $\limsup_n (X_n - Y_n) = X - Y$; and since $E(Y_n) \to E(Y) < \infty$,
$$E[X] - E[Y] \ge \limsup_n E[X_n] - \lim_n E[Y_n] = \limsup_n E[X_n] - E[Y],$$
where the left-hand side splits because $0 \le X \le Y$ gives $E[X] \le E[Y] < \infty$. Hence
$$E[X] \ge \limsup_n E[X_n].$$
Notice also $0 \le X_n$. Again, by Fatou's lemma,
$$E[X] = E\big[\liminf_n X_n\big] \le \liminf_n E[X_n].$$
Then
$$E[X] \le \liminf_n E[X_n] \le \limsup_n E[X_n] \le E[X].$$
Thus, $E[X_n] \to E[X]$.
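The squeeze above can be illustrated with a simple Monte Carlo sketch (illustrative only; the choices $\Omega = [0,1]$ with Lebesgue measure, $X_n(\omega) = (1 - 1/n)\,\omega^2$, and $Y_n(\omega) = (1 + 1/n)\,\omega$ are assumptions satisfying $0 \le X_n \le Y_n$, with $E(Y_n) \to E(Y) = 1/2$ and $E(X) = 1/3$).

```python
import numpy as np

# Sketch: on [0,1] with Lebesgue measure, X_n(w) = (1 - 1/n) w^2 <= Y_n(w) = (1 + 1/n) w,
# X_n -> X = w^2 and Y_n -> Y = w pointwise, E(Y_n) = (1 + 1/n)/2 -> 1/2 = E(Y).
# The theorem gives E(X_n) -> E(X) = 1/3, which we check by Monte Carlo.
rng = np.random.default_rng(5)
w = rng.uniform(size=1_000_000)
EX = np.mean(w ** 2)  # estimate of E(X) = 1/3

def EXn(n):
    """Monte Carlo estimate of E(X_n)."""
    return np.mean((1.0 - 1.0 / n) * w ** 2)

# |E(X_n) - E(X)| = E(X)/n exactly here, so the gaps shrink like 1/n.
errs = [abs(EXn(n) - EX) for n in (2, 10, 100)]
```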