\newfam\rsfs
\newfam\bbold
\def\scr#1{{\fam\rsfs #1}}
\def\bb#1{{\fam\bbold #1}}
\let\oldcal\cal
\def\cal#1{{\oldcal #1}}
\font\rsfsten=rsfs10
\font\rsfssev=rsfs7
\font\rsfsfiv=rsfs5
\textfont\rsfs=\rsfsten
\scriptfont\rsfs=\rsfssev
\scriptscriptfont\rsfs=\rsfsfiv
\font\bbten=msbm10
\font\bbsev=msbm7
\font\bbfiv=msbm5
\textfont\bbold=\bbten
\scriptfont\bbold=\bbsev
\scriptscriptfont\bbold=\bbfiv

\def\Pr{\bb P}
\def\E{\bb E}
\newcount\qnum
\def\q{\afterassignment\qq\qnum=}
\def\qq{\qqq{\number\qnum}}
\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
\def\fr#1#2{{#1\over #2}}

\q2

$U$ and $V$ are independent, so $$\Pr(W=1) = \Pr(U=1)\Pr(V=1)
+ \Pr(U=-1)\Pr(V=-1)$$$$= ab + (1-a)(1-b) = 1 - a - b + 2ab.$$
If $W$ and $U$ are to be pairwise independent, $$\Pr(W=1) = \Pr(W=1\mid U=1) =
\Pr(V=1) = b = 1 - a - b + 2ab$$$$\Longrightarrow 2b - 2ab = 2b(1-a) =
1 - a$$$$\Longrightarrow b = \fr12,$$ provided $a\neq1$.

By a similar derivation, $a = \fr12.$ The three variables $U$, $V$, $W$ are
not independent: if they were, we would have $\Pr(W=1) =
\Pr(W=1\mid U=1,\,V=1) = 1$, but $\Pr(W=1) = \fr12 \neq 1.$
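As a check (taking $a = b = \fr12$): each pair drawn from $\{U,V,W\}$ takes
each of its four sign patterns with probability $\fr14$; for instance
$$\Pr(U=1,\,W=1) = \Pr(U=1,\,V=1) = \fr14 = \Pr(U=1)\Pr(W=1),$$
confirming pairwise independence.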

\q3

Both $X$ and $Y$ can be transformed into Bernoulli trials by invertible
affine maps, which preserve independence and the identity
$\E(XY) = \E(X)\E(Y)$ (see the sketch below), so it is enough to prove the
claim for a pair of Bernoulli trials.
Necessity is shown by Theorem 3.19. For sufficiency, suppose
$\E(XY) = \E(X)\E(Y)$. Then
$$\E(XY) = \Pr(X=1,\,Y=1) = \Pr(X=1)\Pr(Y=1\mid X=1) = \E(X)\E(Y) =
\Pr(X=1)\Pr(Y=1),$$
so $\Pr(Y=1\mid X=1) = \Pr(Y=1)$; the remaining three joint probabilities
factor in the same way, so $X$ and $Y$ are independent.
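To spell out the reduction (a sketch, writing $a < b$ for the two values of
$X$ and $c < d$ for those of $Y$): set $X' = (X-a)/(b-a)$ and
$Y' = (Y-c)/(d-c)$, so that $X'$ and $Y'$ take values in $\{0,1\}$. The maps
are invertible, so $X$ and $Y$ are independent exactly when $X'$ and $Y'$
are, and expanding the products gives
$$\E(XY) - \E(X)\E(Y) = (b-a)(d-c)\bigl[\E(X'Y') - \E(X')\E(Y')\bigr],$$
so one side vanishes exactly when the other does.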

\q5

Let $Z = \min(X,Y)$. By independence,
$$\Pr(Z>z) = \Pr(\{X>z\}\cap\{Y>z\}) = \Pr(X>z)\Pr(Y>z).$$

$$\Pr(X>z) = (1-p)^z \Longrightarrow \Pr(X>z)\Pr(Y>z) = (1-p)^z(1-r)^z =
(1-p-r+pr)^z.$$

$$\Pr(Z=z) = \Pr(Z>z-1) - \Pr(Z>z) = (1-p-r+pr)^{z-1} - (1-p-r+pr)^z =
(p+r-pr)(1-p-r+pr)^{z-1},$$
so $Z$ is a random variable following a geometric distribution with
parameter $p+r-pr.$
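As a sanity check, take $p = r = \fr12$: the parameter is
$\fr12 + \fr12 - \fr14 = \fr34$, and indeed
$$\Pr(Z=1) = 1 - \Pr(X>1)\Pr(Y>1) = 1 - \fr12\cdot\fr12 = \fr34.$$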

\q7

Conditioning on $N$, which is independent of the $X_i$,
$$\E(X_1 + X_2 +\cdots+ X_N) = \sum_{n=1}^\infty \Pr(N=n)\,
\E\left(\sum_{i=1}^n X_i\right).$$$$
\E\left(\sum_{i=1}^n X_i\right) = \sum_{\omega\in\Omega}\sum_{i=1}^n
\Pr(\omega)X_i(\omega) = \sum_{i=1}^n\sum_{\omega\in\Omega}
\Pr(\omega)X_i(\omega) = \sum_{i=1}^n\E(X_i) = n\mu$$$$\Longrightarrow
\E(X_1+\cdots+X_N) = \sum_{n=1}^\infty n\Pr(N=n)\,\mu = \mu\E(N).$$
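For a concrete instance (an illustrative choice, not from the problem): roll
a fair die to get $N$, then roll $N$ further dice and total their scores.
Here $\mu = \E(N) = \fr72$, so the expected total is $\mu\E(N) = \fr{49}4.$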

\q12

When $i$ of the $c$ coupons have been collected, the probability that a
trial yields a new coupon is $(c-i)/c$, so the probability that the next new
coupon arrives on the $n$th subsequent trial is
$((c-i)/c)(1 - (c-i)/c)^{n-1}$: the waiting time is geometric with
parameter $(c-i)/c$. The expected value of a geometric distribution with
parameter $p$ is $1/p$ because, writing $q = 1-p$,
$$\E(X) = \sum_{i=1}^\infty i\Pr(X=i) = \sum_{i=1}^\infty \sum_{j=1}^i
pq^{i-1} = \sum_{j=1}^\infty \sum_{i=j}^\infty pq^{i-1}$$$$ =
\sum_{j=1}^\infty q^{j-1}\sum_{i=1}^\infty pq^{i-1} = \sum_{j=1}^\infty
q^{j-1} = (1/p)\sum_{j=1}^\infty pq^{j-1} = 1/p.$$

Summing the expected waiting times over $i = 0, 1, \ldots, c-1$, the
expected number of trials needed to complete the set is
$$\sum_{i=0}^{c-1} {c\over c-i} = c\sum_{k=1}^c \fr1k.$$
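For example, with $c = 6$ (collecting all faces of a die) this gives
$$6\left(1 + \fr12 + \fr13 + \fr14 + \fr15 + \fr16\right) = \fr{147}{10}
= 14.7$$ trials on average.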


\q13

Suppose a string of $n$ coupons contains $i$ unique coupons. Then $n-i$
duplicate coupons have been collected, each with probability $i/c$ of being
drawn instead of a new coupon. The probability of drawing a new coupon is
$(c-i)/c$, as in the question above, so the $i$ unique coupons are collected
with probabilities
$c/c, (c-1)/c, (c-2)/c, \ldots, (c-i+1)/c$. The product of all these factors
is
$$\left({i\over c}\right)^{n-i}{c!\over(c-i)!c^i} =
{i^{n-i}c!\over(c-i)!c^n},$$
and the expected number of unique coupons in a string of $n$ is
$$\sum_{i=1}^{\min(n,c)} i\,{i^{n-i}c!\over(c-i)!c^n}.$$
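As a partial check, $n=1$ gives $1\cdot(c/c) = 1$ and $n=2$ gives
$\fr1c + 2\,{c-1\over c} = 2 - \fr1c$, both matching direct computation.
Note, however, that the factor $(i/c)^{n-i}$ treats every duplicate as
though all $i$ unique coupons were already in hand, which is exact only when
the duplicates all arrive after the unique coupons.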

\q14

For a fixed value of $N$, $X$ and $Y$ are binomial:
$$\Pr(N=m) = {\lambda^m\over m!}e^{-\lambda}, \qquad
\Pr(X=k\mid N=m) = {m\choose k}p^kq^{m-k},$$
where $q = 1-p$. Conditioning on $N,$
$$\Pr(X=k) = \sum_{i=k}^\infty \Pr(N=i){i\choose k}p^kq^{i-k} =
\sum_{i=k}^\infty {\lambda^ie^{-\lambda}\over i!}{i\choose k}p^kq^{i-k}
$$$$ =
{(p\lambda)^ke^{-\lambda}\over k!}
    \sum_{i=k}^\infty{(q\lambda)^{i-k}\over(i-k)!}
= {(p\lambda)^ke^{-\lambda}e^{q\lambda}\over k!}
= {(p\lambda)^ke^{-p\lambda}\over k!}
,$$
so $X$ is Poisson with parameter $p\lambda.$
By the same calculation, $\Pr(Y=k) = {(q\lambda)^ke^{-q\lambda}\over k!}.$

To prove independence, factor the joint distribution. Since
$\{X=x,\,Y=y\} = \{X=x,\,N=x+y\},$
$$\Pr(X=x,\,Y=y) = \Pr(N=x+y)\Pr(X=x\mid N=x+y)
= {\lambda^{x+y}e^{-\lambda}\over(x+y)!}{x+y\choose x}p^xq^y$$
$$= {(p\lambda)^xe^{-p\lambda}\over x!}\cdot
{(q\lambda)^ye^{-q\lambda}\over y!} = \Pr(X=x)\Pr(Y=y).$$
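As a quick check, $\sum_{k\ge0}\Pr(X=k) =
e^{-p\lambda}\sum_{k\ge0}(p\lambda)^k/k! = 1$, and
$\E(X) + \E(Y) = p\lambda + q\lambda = \lambda = \E(N)$, as must hold since
$X + Y = N.$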

\qqq{3.37}
\def\var{\mathop{\rm var}\nolimits}
Let $A_i$ be the event that the $i$th king and queen sit next to each other
(with kings and queens seated alternately), and let $N$ count the adjacent
couples:
$$N = \sum_{i=1}^n 1_{A_i}.$$
This means
$$\E(N) = \E\left(\sum_{i=1}^n 1_{A_i}\right) = \sum_{i=1}^n\E(1_{A_i})
= \sum_{i=1}^n\Pr(A_i) = n\Pr(A_1),$$
by symmetry.
$\Pr(A_1) = 2/n$ because any given queen is equally likely to occupy each of
the $n$ queen seats, $2$ of which adjoin her king, so $\E(N) = n(2/n)=2.$
$\var(N) = \E(N^2) - \E(N)^2,$ so we need $\E(N^2).$
$$\E(N^2) = \E\left(\left[\sum_i 1_{A_i}\right]^2\right)
= \E\left(\sum_i 1_{A_i}^2 + \sum_{i\neq j}1_{A_i}1_{A_j}\right)
= \E\left(\sum_i 1_{A_i} + \sum_{i\neq j}1_{A_i\cap A_j}\right),$$
since $1_{A_i}^2 = 1_{A_i}$ and $1_{A_i}1_{A_j} = 1_{A_i\cap A_j}.$

By symmetry, this is equal to
$$n\Pr(A_1) + n(n-1)\Pr(A_1\cap A_2),$$
and the second term follows by conditioning:
$$\Pr(A_1\cap A_2) = \Pr(A_1)\Pr(A_2|A_1) =
\fr2n\left(\fr1{n-1}\cdot\fr1{n-1} + \fr{n-2}{n-1}\cdot\fr2{n-1}\right)
= {2(2n-3)\over n(n-1)^2}.$$
The first term corresponds to the second king sitting directly on the other
side of the first queen, which leaves the second queen exactly one free seat
beside her king; the second term is the complementary case, in which both
seats beside the second king are free.

This leaves
$$\var(N) = \E(N^2) - \E(N)^2 = \left(2 + {2(2n-3)\over n-1}\right) -
4 = {2n-4\over n-1}.$$
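As a check, for $n = 2$ each queen seat adjoins both kings, so $N = 2$ with
certainty, and the formula gives $\var(N) = (2\cdot2-4)/(2-1) = 0$; as
$n\to\infty$, $\var(N)\to2 = \E(N)$, consistent with a Poisson limit for
$N.$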

\bye