Diffstat (limited to 'houdre')
-rw-r--r--   houdre/hw2.tex   142
1 files changed, 142 insertions, 0 deletions
diff --git a/houdre/hw2.tex b/houdre/hw2.tex
new file mode 100644
index 0000000..7bd11b4
--- /dev/null
+++ b/houdre/hw2.tex
@@ -0,0 +1,142 @@
+\newfam\rsfs
+\newfam\bbold
+\def\scr#1{{\fam\rsfs #1}}
+\def\bb#1{{\fam\bbold #1}}
+\let\oldcal\cal
+\def\cal#1{{\oldcal #1}}
+\font\rsfsten=rsfs10
+\font\rsfssev=rsfs7
+\font\rsfsfiv=rsfs5
+\textfont\rsfs=\rsfsten
+\scriptfont\rsfs=\rsfssev
+\scriptscriptfont\rsfs=\rsfsfiv
+\font\bbten=msbm10
+\font\bbsev=msbm7
+\font\bbfiv=msbm5
+\textfont\bbold=\bbten
+\scriptfont\bbold=\bbsev
+\scriptscriptfont\bbold=\bbfiv
+
+\def\Pr{\bb P}
+\def\E{\bb E}
+\newcount\qnum
+\def\q{\afterassignment\qq\qnum=}
+\def\qq{\bigskip\goodbreak\noindent{\bf \number\qnum)}\smallskip}
+\def\infsum#1#2{\sum_{#1=#2}^\infty}
+\def\fr#1#2{{#1\over #2}}
+
+\q1
+
+$\E(X(X-1)\cdots (X-k)) = \E(g(X))$ where $g(x) = {x!\over(x-(k+1))!}.$
+
+This gives an expected value of
+$$\infsum x0 g(x){e^{-\lambda}\lambda^x\over x!}
+= e^{-\lambda}\infsum x{k+1} {\lambda^x\over (x-(k+1))!} =
+\lambda^{k+1}e^{-\lambda}\infsum x{k+1} {\lambda^{x-(k+1)}\over (x-(k+1))!}
+= \lambda^{k+1}e^{-\lambda}\infsum x0 {\lambda^x\over x!}
+= \lambda^{k+1}.$$
+Restricting the sum to $x\ge k+1$ is possible because $g(x)$ (read as the
+falling factorial $x(x-1)\cdots(x-k)$) is zero for $x\in\{0,1,\ldots,k\}$.
+The remaining factor $e^{-\lambda}\infsum x0 {\lambda^x\over x!}$ equals
+one because it is the total mass of a Poisson probability mass function.
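+
+As a check (not required by the problem), taking $k=0$ gives $\E(X) =
+\lambda$ and taking $k=1$ gives $\E(X(X-1)) = \lambda^2,$ so
+$$\hbox{Var}(X) = \E(X(X-1)) + \E(X) - \E(X)^2 = \lambda^2 + \lambda -
+\lambda^2 = \lambda,$$
+the familiar Poisson variance.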
+
+\q2
+
+Let $m(r)$ be the mean number of flips needed to see $r$ heads.
+If a head is flipped (with probability $p$), only $r-1$ more heads remain,
+so the mean remaining time is $m(r-1).$
+If a tail is flipped (with probability $q=1-p$), the mean remaining
+number of flips is still $m(r)$.
+Either way the flip itself adds one to the count, so $$m(r) = 1+p\,m(r-1)
++ q\,m(r).$$
+Rearranging into a difference equation gives $$pm_{r+1} - pm_r = 1.$$
+The complementary (homogeneous) equation has solution $c_1$ because its
+characteristic polynomial $\theta - 1$ has only the root $1$, so $m_r =
+c_1\cdot 1^r$ (but, since $0$ heads require $0$ tosses, $m_0 = 0$ and
+$c_1=0$).
+A particular solution is $m_r = r/p$, so the general solution is $m_r =
+c_1 + r/p = r/p.$ This is the mean number of tosses.
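+
+As a quick check (not part of the problem), $r=1$ gives $m(1) = 1/p$,
+the familiar mean of the geometric distribution, and a fair coin with
+$r=3$ gives a mean of $3/(1/2) = 6$ tosses.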
+
+\q4
+
+$\infsum k1 k^\alpha$ converges if and only if $\alpha < -1.$ Choosing
+$$c\infsum k1 k^\alpha = 1\iff c = {1\over \infsum k1 k^\alpha},$$
+the function $p(k) = ck^\alpha$ is a probability mass function: its
+values are nonnegative, and by construction they sum to one over its
+domain.
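+
+For example (using the standard value $\infsum k1 k^{-2} = \pi^2/6$),
+taking $\alpha = -2$ gives $c = 6/\pi^2$ and the probability mass
+function $p(k) = {6\over\pi^2 k^2}.$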
+
+\q5
+
+The geometric distribution is $\Pr(X=k) = pq^{k-1}$ for $k\ge 1$.
+$$\Pr(X > m+n | X > m) = {\infsum x{m+n+1} pq^{x-1}\over \infsum x{m+1}
+pq^{x-1}} = {q^{m+n}\over q^m} = q^n = \Pr(X > n).$$
+These sums have these values because $\infsum x1 pq^{x-1} = 1,$ and a
+sum starting at $x = k+1$ is $q^k$ times the full sum:
+$\infsum x{k+1} pq^{x-1} = q^k\infsum x1 pq^{x-1} = q^k.$
+
+Conversely, the geometric distribution can be derived from the property
+$\Pr(X > m+n | X > m) = \Pr(X > n),$ so it is the only distribution (on
+the positive integers) for which this property holds.
+Writing $a_n = \Pr(X > n),$
+$${\Pr(X > m+n\cap X > m)\over \Pr(X > m)} = \Pr(X > n)
+\to \Pr(X > n)\Pr(X > m) = \Pr(X > m+n)
+\to a_m a_n = a_{m+n}.$$
+Taking $m = 1$ and writing $q = a_1 = \Pr(X > 1)$ gives $a_{n+1} = qa_n,$
+so $a_n = q^n a_0 = q^n$ and
+$$\Pr(X = n) = a_{n-1} - a_n = q^{n-1} - q^n = (1-q)q^{n-1} = pq^{n-1},$$
+which is the geometric distribution.
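+
+As a numerical illustration (not part of the problem), for a fair coin
+($p = q = 1/2$), $\Pr(X > 5 | X > 3) = q^2 = 1/4 = \Pr(X > 2),$
+independent of how many failures have already occurred.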
+
+\q6
+
+$$\infsum k0 \Pr(N > k) = \infsum k1 \infsum jk \Pr(N = j)
+= \infsum j1 \sum_{k=1}^j \Pr(N=j) = \infsum j1 j\Pr(N=j),$$
+where the middle equality swaps the order of summation over the region
+$1\le k\le j$.
+
+Because this last expression is the definition of $\E(N)$, the identity
+holds.
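+
+As a check (not part of the problem), for the geometric distribution
+$\Pr(N > k) = q^k,$ so $\infsum k0 \Pr(N > k) = \infsum k0 q^k =
+1/(1-q) = 1/p,$ which is the known mean.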
+
+Throwing this fair die (three equally likely colors) $n$ times gives
+$3^n$ equally likely patterns. Using only two of the colors, the number
+of possible patterns is $2^n$; using only one, it is $1^n = 1$. By
+inclusion-exclusion, with ${3\choose 2} = 3$ pairs of colors, there are
+$3\cdot 2^n$ patterns missing at least one color, but this counts each
+single-color pattern twice (both the RB and GB pairs include
+BBB$\cdots$B), so the true count is $3\cdot 2^n - 3 = 3(2^n-1).$ The
+probability that $n$ throws do not show all three colors is therefore
+$\Pr = {3(2^n-1)\over 3^n} = {2^n - 1\over 3^{n-1}}.$
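+
+As a check (not part of the problem), at $n=3$ the formula gives
+$(2^3 - 1)/3^2 = 7/9$; directly, the number of length-three patterns
+showing all three colors is $3! = 6$, so the probability of missing a
+color is $1 - 6/27 = 7/9.$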
+
+Using the formula $n\sum_{k=1}^n \fr1k$ to be proved in question seven,
+the expected number of throws to see all three colors is $\fr{11}2$
+because $3(1+\fr12+\fr13) = 3\cdot\fr{11}6 = \fr{11}2$.
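+
+This agrees with the tail-sum formula proved above (a consistency check,
+not part of the problem): with $\Pr(N > n) = {2^n-1\over 3^{n-1}}$ for
+$n\ge 1$ and $\Pr(N > 0) = 1$,
+$$\E(N) = \infsum n0 \Pr(N > n) = 1 + \infsum n1
+3\left(\left(\fr23\right)^n - \left(\fr13\right)^n\right)
+= 1 + 3\left(2 - \fr12\right) = \fr{11}2.$$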
+
+\q7
+
+Let the coupon collector be attempting to collect $n$ coupons. The mean
+time to the first coupon is one draw, so $m(1) = 1$. After $k$ distinct
+coupons have been collected, the probability that a draw yields a new
+coupon is ${n-k\over n}.$
+Conditioning on whether the next draw is new gives the mean time
+$m(k+1)$ to the $(k+1)$st distinct coupon:
+$$m(k+1) = (m(k)+1){n-k\over n} + (m(k+1)+1){k\over n} \to
+{n-k\over n}m(k+1) = m(k){n-k\over n} + 1 \to m(k+1) = \fr n{n-k}+m(k).$$
+Therefore $m(n) = \sum_{k=0}^{n-1} \fr n{n-k} = \sum_{k=1}^n \fr nk =
+n\sum_{k=1}^n \fr1k.$
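+
+As a check (not part of the problem), $n=2$ gives $2(1+\fr12) = 3$: the
+first coupon is immediate, and the second is then a geometric wait with
+success probability $\fr12$ and mean $2$, for a total of $3$ draws.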
+
+The probability that the first $n$ draws do not form a complete set of
+$k$ coupons is $$\sum_{j=1}^{k-1} (-1)^{j+1}{k\choose j}
+{(k-j)^n\over k^n}.$$
+This is the probability, computed by inclusion-exclusion, of the union
+over all $k$ coupons of the events ``coupon $i$ does not appear in the
+first $n$ draws'': a set of $j$ excluded coupons can be chosen in
+${k\choose j}$ ways, and the remaining $k-j$ coupons account for
+$(k-j)^n$ of the $k^n$ equally likely sequences.
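+
+As a check (not part of the problem), $k=3$ gives ${3\choose 1}{2^n\over
+3^n} - {3\choose 2}{1^n\over 3^n} = {3\cdot 2^n - 3\over 3^n} = {2^n -
+1\over 3^{n-1}},$ matching question six.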
+
+\q9
+
+Zero heads in a row is achieved immediately---zero tosses, so the mean
+number of tosses is zero: $m(0) = 0.$
+Given that flipping $n$ heads in a row takes, on average, $m(n)$ tosses,
+the mean number of tosses $m(n+1)$ for $n+1$ heads in a row satisfies
+$$m(n+1) = p(m(n)+1) + q(m(n)+m(n+1)+1).$$
+The first case is if one more head is flipped---this has probability
+$p$ and requires, on average, the mean tosses for $n$ heads in a row
+plus one more for the head.
+The second case is if one more tail is flipped---this has probability
+$q=1-p$ and requires one more flip for the tail and $m(n+1)$ more
+expected flips after the failure, which comes, on average, after
+$m(n)$ flips.
+
+Solving this as a difference equation gives $pm_{n+1} - m_n - 1 = 0$
+(using $1-q = p$). Let $k = 1/p,$ so $m_{n+1} = km_n + k.$ A particular
+solution is $m_n = {k(k^n-1)\over k-1}.$ A complementary solution is
+$c_1k^{n-1},$ so the general solution is $m_n = c_1k^{n-1} +
+{k(k^n-1)\over k-1}.$ The particular solution already satisfies
+$m_0 = 0,$ so $c_1 = 0$ and the mean number of tosses is
+$m_n = {k(k^n-1)\over k-1} = {1-p^n\over qp^n}.$
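+
+As a check (not part of the problem), $n=1$ gives $m_1 = k = 1/p,$ the
+mean of the geometric distribution, and a fair coin gives $m_n =
+2(2^n-1),$ e.g.\ $m_2 = 6$ expected tosses to see two heads in a row.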
+\bye