
Example Problems on Convergence of Random Variables

ST779 Homework 7 on Convergence of Random Variables

1

Let $X_n$, $X$, $Y$ be random variables on a probability space $(\Omega, \mathcal{A}, P)$ satisfying $|X_n| \le Y$ for all $n$, $X_n \to X$ almost surely, and $E(Y^2) < \infty$. Then show that $E[(X_n - X)^2] \to 0$.

Notice that $|X_n| \le Y$ implies $X_n^2 \le Y^2$. Then $E(X_n^2) \le E(Y^2)$ for all $n$, and hence $\sup_n E(X_n^2) \le E(Y^2) < \infty$. Now take $\psi(x) = x^2$. Then $\psi(x)/x = x \to \infty$ as $x \to \infty$, so the criterion $\sup_n E[\psi(|X_n|)] < \infty$ shows that $\{X_n\}_n$ is uniformly integrable.

Now, since $X_n \to X$ almost surely and $g(x) = x^2$ is a continuous function, by the Continuous Mapping Theorem $X_n^2 \to X^2$ almost surely; in particular, $(X_n - X)^2 \to 0$ almost surely.

Finally, since $|X_n| \le Y$ for all $n$ and $X_n \to X$, we also have $|X| \le Y$ almost surely, so

$$(X_n - X)^2 \le (|X_n| + |X|)^2 \le 4Y^2, \qquad E(4Y^2) = 4E(Y^2) < \infty.$$

A family dominated by a single integrable random variable is uniformly integrable, so $\{(X_n - X)^2\}_n$ is uniformly integrable. Together with $(X_n - X)^2 \to 0$ almost surely, uniform integrability gives

$$E[(X_n - X)^2] \to 0.$$
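As a quick sanity check, here is a Monte Carlo sketch of this result (a hypothetical construction, not part of the problem): take $W \sim N(0,1)$, $Y = |W|$, $X = W$, and $X_n = (1 - 1/n)W$, so that $|X_n| \le Y$, $X_n \to X$ pointwise, and $E(Y^2) = 1 < \infty$. Here $E[(X_n - X)^2] = 1/n^2$ exactly, which the simulation should track.

```python
# Hypothetical illustration of Problem 1: |X_n| <= Y = |W|, X_n -> X = W
# pointwise, E(Y^2) = 1 < infinity, so E[(X_n - X)^2] -> 0.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal(100_000)   # one draw of W per sample point omega
X = W

for n in [1, 2, 5, 10, 50, 100]:
    Xn = (1 - 1 / n) * W
    mse = np.mean((Xn - X) ** 2)   # Monte Carlo estimate of E[(X_n - X)^2]
    print(f"n={n:4d}  E[(X_n - X)^2] ~ {mse:.6f}  (exact: {1 / n**2:.6f})")
```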

2

Let $X$ be a random variable on a probability space $(\Omega, \mathcal{A}, P)$. Define the moment generating function of $X$ by $\phi(\lambda) = E(e^{\lambda X})$. Let $\Lambda = \{\lambda \in \mathbb{R} : \phi(\lambda) < \infty\}$. Show that $\phi(\lambda)$ is continuous at any interior point of $\Lambda$.

Since continuity at a point of $\mathbb{R}$ is equivalent to sequential continuity, it suffices to show that $\phi(\lambda_n) \to \phi(\lambda_0)$ whenever $\lambda_n \to \lambda_0$ and $\lambda_0$ is an interior point of $\Lambda$.

Notice that $\lambda_n \to \lambda_0$ implies $\lambda_n X \to \lambda_0 X$, and hence $e^{\lambda_n X} \to e^{\lambda_0 X}$ pointwise.

Since $\lambda_0$ is an interior point of $\Lambda$, there exist $a, b \in \Lambda$ with $a < \lambda_0 < b$, and $\lambda_n \in [a, b]$ for all $n \ge N$, say. For such $n$,

$$e^{\lambda_n X} \le e^{aX} + e^{bX}$$

pointwise (the first term dominates when $X < 0$, the second when $X \ge 0$), and $E[e^{aX} + e^{bX}] = \phi(a) + \phi(b) < \infty$. A family dominated by a single integrable random variable is uniformly integrable, so $\{e^{\lambda_n X}\}_{n \ge N}$ is uniformly integrable. Thus,

$$|\phi(\lambda_n) - \phi(\lambda_0)| = |E[e^{\lambda_n X} - e^{\lambda_0 X}]| \underbrace{\le}_{\text{Jensen's}} E|e^{\lambda_n X} - e^{\lambda_0 X}| \underbrace{\longrightarrow 0}_{\text{uniform integrability}}.$$

Then, for $n$ large enough, $|\phi(\lambda_n) - \phi(\lambda_0)| < \varepsilon$, i.e. $\phi(\lambda_n) \to \phi(\lambda_0)$, so $\phi$ is continuous at $\lambda_0$.
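As an illustration (a hypothetical example, not from the homework): for $X \sim N(0,1)$ the moment generating function is $\phi(\lambda) = e^{\lambda^2/2}$ and $\Lambda = \mathbb{R}$, so every point is interior; a Monte Carlo estimate of $\phi(\lambda_n)$ should approach $\phi(\lambda_0)$ as $\lambda_n \to \lambda_0$.

```python
# Hypothetical illustration of Problem 2: X ~ N(0,1) has phi(lambda) = exp(lambda^2/2)
# and Lambda = R, so phi should be continuous at lambda_0 = 0.5.
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal(1_000_000)
lam0 = 0.5
phi0 = np.exp(lam0**2 / 2)                  # exact phi(lambda_0)

for k in range(1, 6):
    lam_n = lam0 + 10.0 ** (-k)             # lambda_n -> lambda_0
    phi_n = np.mean(np.exp(lam_n * X))      # Monte Carlo estimate of E[e^{lambda_n X}]
    print(f"lambda_n={lam_n:.5f}  phi(lambda_n) ~ {phi_n:.5f}  phi(lambda_0)={phi0:.5f}")
```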

3

Let $X_n$ and $Y_n$ be random variables such that $X_n$ and $Y_n$ are independent for each $n$, and suppose that $X_n \to X$ and $Y_n \to Y$ (pointwise convergence). Show that $X$ and $Y$ are independent.

Define $g(x) = e^{itx}$. Since $g$ is continuous, we know that $g(X_n) \to g(X)$ and $g(Y_n) \to g(Y)$ pointwise. Also, $e^{itX} = \cos(tX) + i\sin(tX)$, and $|e^{itX}| \le 1$ for all $t$.

So we can use the DCT: since $g(X_n) \to g(X)$ pointwise, $|g(X_n)| \le 1$, and $E(1) < \infty$, we get $E[g(X_n)] \to E[g(X)]$. Thus the characteristic function of $X_n$ converges to the characteristic function of $X$. The same holds for $Y_n$ and $Y$, and likewise for the joint characteristic function, since $|e^{i(tX_n + uY_n)}| \le 1$ and $e^{i(tX_n + uY_n)} \to e^{i(tX + uY)}$ pointwise.

Then, for fixed $t, u \in \mathbb{R}$, by the independence of $X_n$ and $Y_n$,

$$\phi_{X_n, Y_n}(t, u) = E(e^{itX_n + iuY_n}) = E(e^{itX_n})\,E(e^{iuY_n}) = \phi_{X_n}(t)\,\phi_{Y_n}(u),$$

and taking limits on both sides,

$$\phi_{X, Y}(t, u) = \lim_{n \to \infty} \phi_{X_n, Y_n}(t, u) = \lim_{n \to \infty} \phi_{X_n}(t)\,\phi_{Y_n}(u) = \phi_X(t)\,\phi_Y(u).$$

Since the joint characteristic function of $(X, Y)$ factors into the product of the marginal characteristic functions for all $t, u$, the uniqueness theorem for characteristic functions implies that $X$ and $Y$ are independent.
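A short Monte Carlo sketch of the factorization (a hypothetical construction, not from the homework): take $U, V$ independent, $X_n = (1 + 1/n)U \to X = U$ and $Y_n = (1 + 1/n)V \to Y = V$ pointwise, with $X_n$ independent of $Y_n$ for each $n$; the estimated joint characteristic function of $(X, Y)$ should match the product of the marginals.

```python
# Hypothetical illustration of Problem 3: check phi_{X,Y}(t,u) ~ phi_X(t) phi_Y(u)
# for the limits X = U, Y = V of independent pairs X_n = (1+1/n)U, Y_n = (1+1/n)V.
import numpy as np

rng = np.random.default_rng(2)
U = rng.standard_normal(500_000)       # X = U
V = rng.exponential(size=500_000)      # Y = V, independent of U
t, u = 1.3, -0.7

joint = np.mean(np.exp(1j * (t * U + u * V)))                      # phi_{X,Y}(t, u)
prod = np.mean(np.exp(1j * t * U)) * np.mean(np.exp(1j * u * V))   # phi_X(t) phi_Y(u)
print(f"joint ch.f.  : {complex(joint):.4f}")
print(f"product      : {complex(prod):.4f}")
print(f"|difference| : {abs(joint - prod):.5f}")
```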

4

Let $X$ be an integrable random variable on a probability space $(\Omega, \mathcal{A}, P)$ and $A_n, A \in \mathcal{A}$ such that $P(A_n \Delta A) \to 0$ as $n \to \infty$. Show that $\int_{A_n} X \, dP \to \int_A X \, dP$.

Since $P(A_n \Delta A) \to 0$ and the two pieces of the symmetric difference are disjoint, we know

$$P(A_n \Delta A) = P((A_n \setminus A) \cup (A \setminus A_n)) = P(A_n \setminus A) + P(A \setminus A_n) \to 0.$$

Thus, $P(A_n \setminus A) \to 0$ and $P(A \setminus A_n) \to 0$. Now we can look at $\int_{A_n} X \, dP$.

$$\int_{A_n} X \, dP = \int_{(A_n \cap A) \cup (A_n \cap A^c)} X \, dP = \int_{(A \setminus (A \setminus A_n)) \cup (A_n \setminus A)} X \, dP = \int_A X \, dP - \int_{A \setminus A_n} X \, dP + \int_{A_n \setminus A} X \, dP$$

$$= E[X 1_A] - E[X 1_{A \setminus A_n}] + E[X 1_{A_n \setminus A}].$$

Since $X$ is integrable, we know that for every $\varepsilon > 0$ there exists $M$ such that $\int_{\{|X| > M\}} |X| \, dP < \varepsilon$. Now we can look at one of the expectations.

$$|E[X 1_{A \setminus A_n}]| \le E[|X| 1_{A \setminus A_n}] = \int_{(A \setminus A_n) \cap \{|X| > M\}} |X| \, dP + \int_{(A \setminus A_n) \cap \{|X| \le M\}} |X| \, dP \le \int_{\{|X| > M\}} |X| \, dP + M \, P(A \setminus A_n) < \varepsilon + M \, P(A \setminus A_n)$$

Since $P(A \setminus A_n) \to 0$ and $\varepsilon > 0$ was arbitrary, $E[X 1_{A \setminus A_n}] \to 0$. Similarly, $E[X 1_{A_n \setminus A}] \to 0$. Thus,

$$\int_{A_n} X \, dP = \int_A X \, dP - \int_{A \setminus A_n} X \, dP + \int_{A_n \setminus A} X \, dP \to \int_A X \, dP - 0 + 0 = \int_A X \, dP.$$
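A numerical sketch (a hypothetical setup, not from the homework): take $\Omega = [0, 1]$ with Lebesgue measure, $X(\omega) = \omega^{-1/3}$ (unbounded but integrable), $A = [0, 1/2]$, and $A_n = [1/n, 1/2 + 1/n]$, so that $P(A_n \Delta A) = 2/n \to 0$.

```python
# Hypothetical illustration of Problem 4 on Omega = [0,1] with Lebesgue measure:
# X(w) = w^(-1/3) is integrable, A = [0, 1/2], A_n = [1/n, 1/2 + 1/n].
import numpy as np

rng = np.random.default_rng(3)
w = rng.uniform(size=1_000_000)     # draws from P (w = 0 has probability zero)
X = w ** (-1 / 3)                   # E|X| = 3/2 < infinity although X is unbounded

target = np.mean(X * (w <= 0.5))    # Monte Carlo estimate of int_A X dP
for n in [2, 5, 10, 100, 1000]:
    est = np.mean(X * ((w >= 1 / n) & (w <= 0.5 + 1 / n)))   # int_{A_n} X dP
    print(f"n={n:5d}  int_{{A_n}} X dP ~ {est:.5f}   int_A X dP ~ {target:.5f}")
```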

5

Let $X_n$, $Y_n$, $X$, $Y$ be random variables on a probability space $(\Omega, \mathcal{A}, P)$ satisfying $0 \le X_n \le Y_n$ for all $n$, $X_n \to X$, $Y_n \to Y$ (pointwise convergence), and $E(Y_n) \to E(Y) < \infty$. Then show that $E(X_n) \to E(X)$.

Notice $Y_n - X_n \ge 0$ and $Y_n - X_n \to Y - X$ pointwise; also $0 \le X \le Y$, so $E[X] \le E[Y] < \infty$. Then by Fatou's Lemma,

$$E[Y] - E[X] = E[Y - X] = E\big[\liminf_n (Y_n - X_n)\big] \le \liminf_n E[Y_n - X_n] = \liminf_n \big(E[Y_n] - E[X_n]\big) = E[Y] - \limsup_n E[X_n],$$

where the last equality uses $E[Y_n] \to E[Y]$. Since $E[Y] < \infty$, subtracting it from both sides gives

$$\limsup_n E[X_n] \le E[X].$$

Notice $0 \le X_n$. Again, by Fatou's Lemma,

$$E[X] = E\big[\liminf_n X_n\big] \le \liminf_n E[X_n].$$

Then,

$$E[X] \le \liminf_n E[X_n] \le \limsup_n E[X_n] \le E[X].$$

Thus, $E[X_n] \to E[X]$.
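A Monte Carlo sketch (a hypothetical construction, not from the homework): with $U \sim \mathrm{Unif}(0,1)$, let $Y_n = 1 + n \cdot 1\{U \le 1/n^2\}$ and $X_n = (1 - 1/n) Y_n$, so $0 \le X_n \le Y_n$, $X_n \to X = 1$ and $Y_n \to Y = 1$ pointwise, and $E(Y_n) = 1 + 1/n \to 1 = E(Y)$; the conclusion is $E(X_n) \to E(X) = 1$.

```python
# Hypothetical illustration of Problem 5: 0 <= X_n <= Y_n, X_n -> 1, Y_n -> 1
# pointwise, E(Y_n) = 1 + 1/n -> 1, so E(X_n) = 1 - 1/n^2 should tend to 1.
import numpy as np

rng = np.random.default_rng(4)
U = rng.uniform(size=2_000_000)

for n in [2, 5, 10, 100]:
    Yn = 1 + n * (U <= 1 / n**2)     # a shrinking spike of height n
    Xn = (1 - 1 / n) * Yn
    print(f"n={n:4d}  E(X_n) ~ {np.mean(Xn):.5f}  E(Y_n) ~ {np.mean(Yn):.5f}")
```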