\newfam\rsfs
\newfam\bbold
\def\scr#1{{\fam\rsfs #1}}
\def\bb#1{{\fam\bbold #1}}
\let\oldcal\cal
\def\cal#1{{\oldcal #1}}
\font\rsfsten=rsfs10
\font\rsfssev=rsfs7
\font\rsfsfiv=rsfs5
\textfont\rsfs=\rsfsten
\scriptfont\rsfs=\rsfssev
\scriptscriptfont\rsfs=\rsfsfiv
\font\bbten=msbm10
\font\bbsev=msbm7
\font\bbfiv=msbm5
\textfont\bbold=\bbten
\scriptfont\bbold=\bbsev
\scriptscriptfont\bbold=\bbfiv

\def\Pr{\bb P}
\def\E{\bb E}
\newcount\qnum
\def\q{\afterassignment\qq\qnum=}
\def\qq{\qqq{\number\qnum}}
\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
\def\fr#1#2{{#1\over #2}}
\def\var{\mathop{\rm var}\nolimits}

\q5

The generating function for the number of flowers is $$G_N(s) =
\sum_{n=0}^\infty qp^ns^n = {q\over 1 - ps}.$$
The probability generating function for each flower independently
becoming a fruit is $G_X(s) = {1+s\over2}$, and from Theorem 4.36,
$$G_R(s) = G_N(G_X(s)) =
{q\over 1 - p{1+s\over2}} = {2q\over2-p}\cdot{1\over 1 - {p\over2-p}s}.$$
This corresponds to $$\Pr(R=r) = {2q\over2-p}\left({p\over2-p}\right)^r.$$
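
As a consistency check, each flower is kept with probability $\fr12,$
so $\E(R)$ should equal $\fr12\E(N)$: indeed $\E(N) = G_N'(1) =
{qp\over(1-p)^2} = {p\over q},$ while the geometric form of $\Pr(R=r)$
gives $\E(R) = {p/(2-p)\over1-p/(2-p)} = {p\over2q}.$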

By Bayes' rule,
$$\Pr(N=n\mid R=r) = {\Pr(R=r\mid N=n)\Pr(N=n)\over\Pr(R=r)}
= {{n\choose r}(1/2)^n\,qp^n \over
{2q\over2-p}\left({p\over2-p}\right)^r} =
{n\choose r}{(2-p)^{r+1}p^{n-r}\over2^{n+1}}, \qquad n\ge r.$$
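
A minimal check that this conditional distribution is proper: using
$\sum_{n\ge r}{n\choose r}x^{n-r} = (1-x)^{-(r+1)}$ with $x = p/2,$
$$\sum_{n=r}^\infty{n\choose r}{(2-p)^{r+1}p^{n-r}\over2^{n+1}}
= {(2-p)^{r+1}\over2^{r+1}}\left(1-{p\over2}\right)^{-(r+1)} = 1.$$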

\q6

$\Pr(X=x) = {n\choose x}p^xq^{n-x}.$
Its probability generating function $G_X$ is $$G_X(s) = (q+ps)^n.$$
$$G_X'(s) = np(q+ps)^{n-1} \Longrightarrow \E(X) = G_X'(1) = np.$$
$$G_X''(s) = n(n-1)p^2(q+ps)^{n-2} \Longrightarrow G_X''(1) = n(n-1)p^2.$$
$$\var(X) = G_X''(1) + G_X'(1) - G_X'(1)^2 = n(n-1)p^2 + np - n^2p^2
= np - np^2 = npq.$$
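
The variance identity used here follows from $G_X''(1) = \E[X(X-1)]$
(differentiate the generating function twice and set $s=1$):
$$\var(X) = \E(X^2) - \E(X)^2 = \E[X(X-1)] + \E(X) - \E(X)^2
= G_X''(1) + G_X'(1) - G_X'(1)^2.$$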

The probability that $X$ is even can be computed as
$(G_X(-1)+G_X(1))/2,$ because $((-1)^s + 1^s)/2$ is $1$ if $s$ is even
and $0$ otherwise. Here $G_X(-1) = (q-p)^n = (1-2p)^n$ and $G_X(1) =
1,$ giving $\Pr(X\hbox{ even}) = (1+(1-2p)^n)/2.$
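
As a quick check with $n=1$: $X$ is even only when $X=0,$ which has
probability $q,$ and the formula gives $(1+(1-2p))/2 = 1-p = q.$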

Divisibility by three is similar, but instead of $1$ and $-1$ the cube
roots of unity are used: $1,$ $\omega = e^{i2\pi/3},$ and $\omega^2 =
e^{i4\pi/3}.$ For an integer $s,$ the sum $1^s + \omega^s + \omega^{2s}$
equals $3$ when $s$ is divisible by three (each term is then $1$) and
$0$ otherwise, so
$$\Pr(3\mid X) = \fr13\left(G_X(1) + G_X(\omega) + G_X(\omega^2)\right)
= \fr13\left(1 + (q+p\omega)^n + (q+p\omega^2)^n\right).$$
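
Since $q+p\omega$ and $q+p\omega^2$ are complex conjugates, one way to
finish the simplification is to write $q+p\omega = \rho e^{i\theta}$
with $\rho^2 = (q-\fr12p)^2 + \fr34p^2 = 1-3pq$ and $\tan\theta =
{p\sqrt3\over2-3p}.$ The two conjugate terms then sum to
$2\rho^n\cos n\theta,$ so
$$\Pr(3\mid X) = \fr13\left(1 + 2(1-3pq)^{n/2}\cos n\theta\right).$$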

\q8

Because $X$ and $Y$ are independent and $N=X+Y$, $$G_N(s) =
G_X(s)G_Y(s).$$ Given $N=n$, each of $X$ and $Y$ is binomially
distributed with parameters $n$ and $\fr12$, so the conditional
generating functions are $$\E(s^X\mid N=n) = \E(s^Y\mid N=n) =
(\fr12[1+s])^n.$$ Conditioning on $N$ gives the unconditional
generating functions $$G_X(s) = G_Y(s) = \sum_{i=0}^\infty
\Pr(N=i)(\fr12[1+s])^i = G_N(\fr12[1+s]).$$
Combining this with the first expression gives $$G_N(s) =
G_N(\fr12[1+s])^2.$$ Let $H_N(s) = G_N(1-s),$ so that $G_N(s) =
H_N(1-s).$ Substituting gives the identity $$H_N(1-s) =
H_N(1-\fr12[1+s])^2 = H_N(\fr12[1-s])^2, \qquad\hbox{i.e.}\qquad
H_N(s) = H_N(\fr12s)^2.$$
Iterating this identity (see the sketch below) shows that the only
probability generating function satisfying it is the exponential
$H_N(s) = e^{-\lambda s},$ i.e.\ $G_N(s) = e^{-\lambda(1-s)} =
e^{\lambda(s-1)} = \sum_{k=0}^\infty{\lambda^ke^{-\lambda}\over k!}s^k,$
where $\lambda = \E(N).$ This is a Poisson distribution.
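
A minimal sketch of the iteration, assuming $N$ has finite mean
$\lambda$ so that $H_N(t) = G_N(1-t) = 1-\lambda t + o(t)$ as
$t\to0^+$:
$$H_N(s) = H_N(\fr12s)^2 = H_N(\fr14s)^4 = \cdots = H_N(2^{-n}s)^{2^n}
= \left(1 - {\lambda s\over2^n} + o(2^{-n})\right)^{2^n}
\longrightarrow e^{-\lambda s}.$$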

\q9

Each collection independently yields a red token with probability
$p=1/3$. The probability that the first $j-1$ collections yield no red
token and the $j$th does is $p(1-p)^{j-1},$ so the generating function
of the waiting time is $$G(s) = \sum_{j=1}^\infty
p(1-p)^{j-1}s^j = ps\sum_{j=1}^\infty (s(1-p))^{j-1} =
{ps\over1-s(1-p)}.$$
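
Differentiating recovers the familiar mean of this geometric waiting
time:
$$G'(s) = {p\over(1-s(1-p))^2} \Longrightarrow
\E(\hbox{collections}) = G'(1) = {1\over p} = 3.$$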

In the general case,
let $G_{Y_n}$ be the probability generating function of the time $Y_n$
to acquire $n$ distinct coupons. $G_{Y_1}(s) = s$ because the first
collection always yields a new coupon. After $n$ distinct coupons have
been collected, each further collection yields a new coupon with
probability $p = (m-n)/m,$ so the ``time to the next new coupon''
equals $k$ with probability $pq^{k-1},$ where $q = n/m,$ and has
generating function ${ps\over1-qs}.$ These waiting times are
independent, so the generating functions multiply:
$$G_{Y_{n+1}}(s) = G_{Y_n}(s)\,{{m-n\over m}s\over1-(n/m)s}
= G_{Y_n}(s)\,{(m-n)s\over m-ns}
\Longrightarrow G_{Y_n}(s) = {s^n(m-1)(m-2)\cdots(m-n+1) \over
(m-s)(m-2s)\cdots(m-(n-1)s)}.$$
$$\Longrightarrow G_Y(s) = G_{Y_m}(s) =
{s^m(m-1)!\over(m-s)(m-2s)\cdots(m-(m-1)s)}.$$
$$\Longrightarrow \E(Y) = G_Y'(1) = m + \sum_{k=1}^{m-1}{k\over m-k}
= m\left(1 + {1\over2} + \cdots + {1\over m}\right),$$
by logarithmic differentiation at $s=1$ and the rewriting
${k\over m-k} = {m\over m-k} - 1.$ This is the given form.
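
For example, with $m=2$ coupons the formula gives $G_Y(s) =
{s^2\over2-s},$ so $\E(Y) = G_Y'(1) = 3 = 2\left(1+\fr12\right)$: one
collection for the first coupon plus a geometric wait with mean $2$ for
the second.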

\q10

The mean of a discrete random variable $X$ is $\E(X) = \sum_i
i\Pr(X=i).$ For a nonnegative integer-valued random variable, the
probability generating function is $\phi(s) = \sum_{i=0}^\infty
\Pr(X=i)s^i.$

$\phi'(1) = \sum_{i=0}^\infty i\Pr(X=i) = \E(X),$ so $\phi(s) =
p(s)/q(s)$ has mean value $$\phi'(1) = {p'(1)\over q(1)} -
{p(1)q'(1)\over q(1)^2} = {p'(1)\over q(1)} - \phi(1){q'(1)\over q(1)} =
{p'(1)-q'(1)\over q(1)}$$
since $\phi(1) = 1$ for any probability generating function.
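
As a quick check of this identity on a simple rational generating
function: the geometric distribution on $\{1,2,\ldots\}$ with parameter
$\fr12$ has $\phi(s) = {s\over2-s},$ and with numerator $s$ and
denominator $2-s$ the identity gives $\phi'(1) = {1-(-1)\over1} = 2,$
the correct mean.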

Duelist A wins the duel on their $n$th shot with probability
$a(1-a)^{n-1}(1-b)^{n-1}$: both duelists must miss through the first
$n-1$ rounds, and A must then hit. Let $r = (1-a)(1-b).$ Summing over
$n,$
$$\sum_{n=1}^\infty ar^{n-1} = {a\over 1-r} = {a\over a+b-ab}.$$
This is the probability that duelist A wins.
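
For instance, if $a=b$ this gives $\Pr(\hbox{A wins}) = {a\over2a-a^2}
= {1\over2-a} > \fr12,$ reflecting A's advantage in firing first.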

The number of shots fired has probability generating function
$$\phi(s) = {as\over 1-rs^2} + {b(1-a)s^2\over 1-rs^2}
= {as+b(1-a)s^2\over1-rs^2}.$$
The two terms collect, respectively, the outcomes in which A wins (an
odd number of shots) and those in which B wins (an even number), so
$$\E(\hbox{shots fired}) = \phi'(1) = {a + 2b(1-a) + 2r\over 1-r}$$
by the earlier established identity with $p=as+b(1-a)s^2$ and
$q=1-rs^2.$ This simplifies to ${2-a\over a+b-ab}.$
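
The same value can be checked by summing directly over the possible
durations, with A firing first: the duel lasts $2k-1$ shots with
probability $ar^{k-1}$ and $2k$ shots with probability $b(1-a)r^{k-1},$
so
$$\E(\hbox{shots fired}) = \sum_{k=1}^\infty\left[(2k-1)a +
2kb(1-a)\right]r^{k-1}
= {2[a+b(1-a)]\over(1-r)^2} - {a\over1-r} = {2-a\over1-r}
= {2-a\over a+b-ab},$$
using $\sum_{k\ge1}kr^{k-1} = (1-r)^{-2}$ and $a+b(1-a) = 1-r.$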

\q11

Given $N=n,$ the conditional generating function of $F$ is
$$\E(s^F\mid N=n) = \sum_{i=0}^n{n\choose i}p^i(1-p)^{n-i}s^i = (ps+1-p)^n.$$
Conditioning on $N$ gives
$$G_F(s) = \Pr(N=0) + \Pr(N=1)(ps+1-p) + \Pr(N=2)(ps+1-p)^2 +\cdots
= G_N(ps+1-p).$$
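
For illustration, if $N$ is Poisson with parameter $\lambda,$ so
$G_N(s) = e^{\lambda(s-1)},$ this identity gives $G_F(s) =
e^{\lambda(ps+1-p-1)} = e^{\lambda p(s-1)},$ i.e.\ $F$ is Poisson with
parameter $\lambda p.$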

(b) is true by 3.6.14.

(c) is a case of 4.5.8: with $p=1/2$ (a fair coin), the two variables
summing to $N$ are independent, which forces $N$ to have a Poisson
distribution, exactly as in question 8.

\bye