\newfam\rsfs
\newfam\bbold
\def\scr#1{{\fam\rsfs #1}}
\def\bb#1{{\fam\bbold #1}}
\let\oldcal\cal
\def\cal#1{{\oldcal #1}}
\font\rsfsten=rsfs10
\font\rsfssev=rsfs7
\font\rsfsfiv=rsfs5
\textfont\rsfs=\rsfsten
\scriptfont\rsfs=\rsfssev
\scriptscriptfont\rsfs=\rsfsfiv
\font\bbten=msbm10
\font\bbsev=msbm7
\font\bbfiv=msbm5
\textfont\bbold=\bbten
\scriptfont\bbold=\bbsev
\scriptscriptfont\bbold=\bbfiv

\def\Pr{\bb P}
\def\E{\bb E}
\newcount\qnum
\def\q{\afterassignment\qq\qnum=}
\def\qq{\qqq{\number\qnum}}
\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
\def\align#1{\vcenter{\halign{$\displaystyle##\hfil$\tabskip1em&&
    $\hfil\displaystyle##$\cr#1}}}
\def\fr#1#2{{#1\over #2}}
\def\var{\mathop{\rm var}\nolimits}
\def\cov{\mathop{\rm cov}\nolimits}
\def\infint{\int_{-\infty}^\infty}
\def\pa#1#2{\partial#1/\partial#2}

\q2

$$\E\left(\fr1{n-1}\sum_{i=1}^n(X_i-\overline X)^2\right)
= \E\left(\fr1{n-1}\sum_{i=1}^n((X_i-\mu) - (\overline X -
\mu))^2\right)$$$$
= \E\left(\fr1{n-1}\sum_{i=1}^n((X_i-\mu)^2-2(X_i-\mu)(\overline X-\mu)
+ (\overline X-\mu)^2)\right)$$$$
= \E\left(\fr1{n-1}\Bigl(\sum_{i=1}^n(X_i-\mu)^2
- n(\overline X-\mu)^2\Bigr)\right)
= \fr1{n-1}(n\sigma^2 - n\E((\overline X-\mu)^2))$$$$
= \fr1{n-1}\Bigl(n\sigma^2 - n\E\Bigl(\Bigl({1\over n}
((X_1-\mu)+\cdots+(X_n-\mu))\Bigr)^2\Bigr)\Bigr)$$$$
= \fr1{n-1}\Bigl(n\sigma^2 - {1\over n}
(\E((X_1-\mu)^2)+\cdots+\E((X_n-\mu)^2))\Bigr)
= \fr1{n-1}(n\sigma^2 - \sigma^2) = \sigma^2.$$
The third equality uses $\sum_{i=1}^n(X_i-\mu) = n(\overline X-\mu),$
so the middle term collapses to $-2n(\overline X-\mu)^2.$ The cross
terms in the final expansion vanish, since $\E((X_i-\mu)(X_j-\mu)) =
\E(X_i-\mu)\E(X_j-\mu) = 0$ for $i\neq j,$ by independence.
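
As a quick check, take $n=2.$ Then
$\fr1{n-1}\sum_{i=1}^2(X_i-\overline X)^2 = \fr12(X_1-X_2)^2,$ and
since $\E(X_1-X_2) = 0,$
$$\E\left(\fr12(X_1-X_2)^2\right) = \fr12\var(X_1-X_2) =
\fr12(\sigma^2+\sigma^2) = \sigma^2.$$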

\q3

$1 = \E(S_n/S_n) = n\E(X_1/S_n),$ so $\E(X_1/S_n) = 1/n,$ using
linearity and the symmetry of $X_1,\ldots,X_n$ within $S_n.$ Hence for
$m\leq n,$ $\E(S_m/S_n) = m\E(X_1/S_n) = m/n.$
For $k>n,$ however, $X_k$ is not a summand of $S_n,$ so symmetry gives
no reason for $\E(X_k/S_n)$ to equal $\E(X_1/S_n),$ and the equality
need not hold for $m>n.$
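
For instance (a sketch, assuming the $X_i$ are positive with
$\E(1/X_1) < \infty$), take $n=1$ and $m=2$:
$$\E(S_2/S_1) = \E\left({X_1+X_2\over X_1}\right) =
1 + \E(X_2)\E(1/X_1) > 1 + {\E(X_2)\over\E(X_1)} = 2,$$
by independence and Jensen's inequality (strict for nondegenerate
$X_1$), so the equality fails here.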

\q8

Conditional on $N=n,$ the mgf of the sum is the product of its
components' mgfs, that is, $M_{X_1}(t)^n.$ Also $M_U(\ln s) =
\E(e^{(\ln s)U}) = \E(s^U) = G_U(s),$ so the mgf of the random sum is
$$M(t) = \sum_n\Pr(N=n)M_{X_1}(t)^n = G_N(M_{X_1}(t)) =
M_N(\ln M_{X_1}(t)).$$
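
As an illustration (not part of the problem), if $N$ is Poisson with
parameter $\lambda,$ so $G_N(s) = e^{\lambda(s-1)},$ and the $X_i$ are
Bernoulli with parameter $p,$ so $M_{X_1}(t) = 1-p+pe^t,$ then
$$M(t) = G_N(M_{X_1}(t)) = e^{\lambda p(e^t-1)},$$
the mgf of a Poisson variable with parameter $\lambda p.$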

\q10

Expanding $\cov(X,Y) = \E(XY)-\E(X)\E(Y)$ gives
$$\cov(X,Y) = \E((X_1+\cdots+X_n)(Y_1+\cdots+Y_n))-\E(X_1+\cdots+X_n)
\E(Y_1+\cdots+Y_n),$$
which becomes, by linearity,
$$\cov(X,Y) = \sum_{i=1}^n\sum_{j=1}^n(\E(X_iY_j) - \E X_i\E Y_j)
= \sum_{i=1}^n(\E(X_iY_i) - \E X_i\E Y_i) = \sum_{i=1}^n\cov(X_i,Y_i),$$
since $\E(X_iY_j) = \E X_i\E Y_j$ for $i\neq j,$ by independence of
$X_i$ and $Y_j.$

By this, since the games are independent and identically distributed,
$\cov(X,Y) = n\cov(X_1,Y_1),$ where
$$\cov(X_1,Y_1) = \E(X_1Y_1) - \E(X_1)\E(Y_1)
= 2pq - (2p^2+2pq)(2q^2+2pq)$$$$
= 2pq - 4p^2q^2 - 4p^3q - 4pq^3 - 4p^2q^2
= 2pq - 4pq(p^2+2pq+q^2) = 2pq - 4pq = -2pq,$$
using $p+q=1.$ So $\cov(X,Y) = -2npq.$
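
These expectations suggest each game splits two points between the
players, i.e.\ $X_1+Y_1 = 2$ with $X_1$ binomial with parameters $2$
and $p$; granting that, the result checks out another way:
$$\cov(X_1,Y_1) = \cov(X_1,2-X_1) = -\var(X_1) = -2pq.$$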

\q14

The time to collect all $c$ coupons is the sum of independent geometric
variables with success probabilities $p = c/c, (c-1)/c, \ldots, 1/c.$
The moment generating function of a geometric variable (counting
failures, so $\Pr(X=x) = pq^x$ for $x = 0, 1, \ldots$) is
$$\sum_{x=0}^\infty e^{tx}\Pr(X=x) = \sum_{x=0}^\infty e^{tx}pq^x =
\sum_{x=0}^\infty p(e^tq)^x = {p\over 1-e^tq}.$$

The moment generating function of the total is therefore the product of
these geometric moment generating functions, with $p = k/c$ and
$q = (c-k)/c$ for $k = c, c-1, \ldots, 1$:
$$\prod_{k=1}^c{k/c\over1-{c-k\over c}e^t} = {c!\over c^c}\,
{1\over\bigl(1-{c-1\over c}e^t\bigr)\bigl(1-{c-2\over c}e^t\bigr)
\cdots\bigl(1-{1\over c}e^t\bigr)}.$$
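
As a check on this convention (which counts only the failed draws
before each new coupon), the mean follows from the geometric means
$q/p$:
$$\E T = \sum_{k=1}^c{(c-k)/c\over k/c} = \sum_{k=1}^c{c-k\over k} =
c\sum_{k=1}^c{1\over k} - c,$$
the usual coupon-collector mean less the $c$ successful draws.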

\q17

By the inversion formula (7.80), with $\phi(t) = {1\over1+t^2},$
$$f(x) = {1\over2\pi}\infint e^{-itx}\phi(t)\,dt =
{1\over2\pi}\infint{e^{-itx}\over1+t^2}\,dt =
{1\over2\pi}\infint{\cos(tx)\over1+t^2}\,dt = {1\over2}e^{-|x|};$$
the sine part vanishes since $1/(1+t^2)$ is even, and the last step
uses the standard integral $\infint\cos(tx)/(1+t^2)\,dt = \pi e^{-|x|}.$
So ${1\over1+t^2}$ is the characteristic function of the density
$f(x) = {1\over2}e^{-|x|}.$
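
The pairing can also be verified in the forward direction, computing
the characteristic function of $f$ directly:
$$\infint e^{itx}\,{1\over2}e^{-|x|}\,dx =
{1\over2}\int_{-\infty}^0 e^{(1+it)x}\,dx +
{1\over2}\int_0^\infty e^{(it-1)x}\,dx =
{1\over2}\left({1\over1+it} + {1\over1-it}\right) = {1\over1+t^2}.$$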

\q18

The characteristic function of the Cauchy distribution is
$\phi_{X_1}(t) = e^{-|t|}.$ For $A_n = n^{-1}(X_1+\cdots+X_n),$
$$\phi_{A_n}(t) = \E(e^{itn^{-1}(X_1+\cdots+X_n)}) =
\phi_{X_1}(t/n)^n = (e^{-|t|/n})^n = e^{-|t|} = \phi_{X_1}(t),$$
so $A_n$ has the same Cauchy distribution as $X_1.$
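
This uses independence together with the scaling rule for
characteristic functions, immediate from the definition:
$$\phi_{aX}(t) = \E(e^{it(aX)}) = \E(e^{i(at)X}) = \phi_X(at),
\qquad a\in\bb R,$$
applied with $a = 1/n$ to each summand.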

\q20

%deeply unsure

By applying the partition theorem to $M(t) = \E(e^{tX}),$ we get
$$M(t) = \sum_{k=1}^\infty\Pr(N=k)\E(e^{tX}\mid N=k).$$

Given that $\Pr(N=k) = {1\over(e-1)k!}$ and that, conditional on
$N=k,$ $X$ is the maximum of $k$ independent uniform variables,
$$\Pr(X\leq x\mid N=k) = \Pr(U_1\leq x)\cdots\Pr(U_k\leq x) = x^k,
\qquad f_{X\mid N=k}(x) = kx^{k-1},$$$$
\E(X^j\mid N=k) = \int_0^1 x^j\,kx^{k-1}\,dx = {k\over k+j},
\qquad
\E(e^{tX}\mid N=k) = \sum_{j=0}^\infty{t^j\over j!}\,{k\over k+j},$$$$
M(t) = \sum_{k=1}^\infty{1\over(e-1)k!}\sum_{j=0}^\infty
{t^j\over j!}\,{k\over k+j},$$
using $\E(e^{tX}\mid N=k) = \sum_{j=0}^\infty{t^j\over j!}
\E(X^j\mid N=k).$
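
As an aside, the stated mass function for $N$ is indeed a distribution
on $\{1,2,\ldots\}$:
$$\sum_{k=1}^\infty{1\over(e-1)k!} = {1\over e-1}(e-1) = 1,$$
since $\sum_{k=1}^\infty 1/k! = e-1.$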

The difference of $R$ and $X$ should be exponentially distributed,
because the quotient $M_R(t)/M_X(t)$ of their moment generating
functions is the moment generating function of an exponential
distribution; that is, $R$ is distributed as $X$ plus an independent
exponential variable.

\bye