 houdre/hw4.tex | 111
 houdre/hw5.tex | 111
 2 files changed, 222 insertions, 0 deletions
diff --git a/houdre/hw4.tex b/houdre/hw4.tex
index d4ffebb..84dee63 100644
--- a/houdre/hw4.tex
+++ b/houdre/hw4.tex
@@ -24,19 +24,130 @@
\def\qq{\qqq{\number\qnum}}
\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
\def\fr#1#2{{#1\over #2}}
\def\var{\mathop{\rm var}\nolimits}

\q5
The generating function for the number of flowers is
$$G_N(s) = \sum_{n=0}^\infty qp^ns^n = {q\over 1 - ps}.$$
The probability generating function for each flower becoming a fruit is
$G_X(s) = {1+s\over2}$, and from theorem 4.36,
$$G_R(s) = G_N(G_X(s)) =
{q\over 1 - p{1+s\over2}} = {2q\over2-p}{1\over 1 - {p\over2-p}s}.$$
This corresponds to
$$\Pr(R=r) = {2q\over2-p}\left({p\over2-p}\right)^r.$$
By Bayes' rule, since $\Pr(R=r|N=n) = {n\choose r}(1/2)^n$ and
$\Pr(N=n) = qp^n$,
$$\Pr(N=n|R=r) = {\Pr(R=r|N=n)\Pr(N=n)\over\Pr(R=r)}
= {{n\choose r}(1/2)^n\,qp^n \over
{2q\over2-p}\left({p\over2-p}\right)^r} =
{n\choose r}{(2-p)^{r+1}p^{n-r}\over2^{n+1}}.$$

\q6
$\Pr(X=x) = {n\choose x}p^xq^{n-x}.$
Its probability generating function $G_X$ is $$G_X(s) = (q+ps)^n.$$
$$G_X'(s) = pn(q+ps)^{n-1} \Longrightarrow \E(X) = G_X'(1) = pn.$$
$$G_X''(s) = p^2n(n-1)(q+ps)^{n-2} \Longrightarrow G_X''(1) = p^2n(n-1).$$
$$\var(X) = G_X''(1) + G_X'(1) - G_X'(1)^2 = p^2n(n-1) + pn - p^2n^2
= pn - p^2n = qpn.$$

The probability that $X$ is even is $(G_X(-1)+G_X(1))/2$, because
$(-1)^s + 1^s = 2$ if $s$ is even and $0$ otherwise. These values are
$G_X(-1) = (q-p)^n = (1-2p)^n$ and $G_X(1) = 1,$ giving
$(1+(1-2p)^n)/2.$

Divisibility by three is similar, but instead of $1$ and $-1$ the cube
roots of unity are used: $1,$ $e^{i2\pi/3},$ and $e^{i4\pi/3}.$ Summing
the $s$th powers of these roots gives $3$ when $s$ is divisible by
three (each root cubed is $1$) and $0$ otherwise, so the probability is
$${G_X(1) + G_X(e^{i2\pi/3}) + G_X(e^{i4\pi/3})\over3}
= {1 + (q+pe^{i2\pi/3})^n + (q+pe^{i4\pi/3})^n\over3}.$$

\q8
For a given value of $N$, $X$ and $Y$ follow binomial distributions.
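The parity and divisibility-by-three formulas of problem 6 can be spot-checked numerically before moving on; a minimal sketch in Python, where the values of $n$ and $p$ are arbitrary choices made only for this check:

```python
from math import comb

# Spot check of the roots-of-unity formulas for X ~ Binomial(n, p).
# n and p are arbitrary choices for the check.
n, p = 10, 0.3
q = 1 - p

pmf = [comb(n, k) * p**k * q**(n - k) for k in range(n + 1)]

# Brute-force probabilities from the pmf.
p_even = sum(w for k, w in enumerate(pmf) if k % 2 == 0)
p_div3 = sum(w for k, w in enumerate(pmf) if k % 3 == 0)

# Closed forms: (G(1)+G(-1))/2 and (G(1)+G(w)+G(w^2))/3 with G(s)=(q+ps)^n.
G = lambda s: (q + p * s) ** n
w1 = complex(-0.5, 3**0.5 / 2)   # e^{i 2 pi / 3}
w2 = w1.conjugate()              # e^{i 4 pi / 3}

even_formula = (G(1) + G(-1)) / 2
div3_formula = ((G(1) + G(w1) + G(w2)) / 3).real

assert abs(p_even - even_formula) < 1e-12
assert abs(p_div3 - div3_formula) < 1e-12
```
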
Because $X$ and $Y$ are independent and $N=X+Y$,
$$G_N(s) = G_X(s)G_Y(s).$$
For a given value of $N=n$, $X$ and $Y$ both follow binomial
distributions, so conditionally on $N=n$,
$$G_X(s) = G_Y(s) = (\fr12[1+s])^n.$$
Conditioning on $N$ gives the unconditional generating functions
$$G_X(s) = G_Y(s) = \sum_{i=0}^\infty
\Pr(N=i)(\fr12[1+s])^i = G_N(\fr12[1+s]).$$
Combining this with the first expression gives
$$G_N(s) = G_N(\fr12[1+s])^2.$$
Let $H_N(s) = G_N(1-s),$ so that $G_N(s) = H_N(1-s).$ This gives the
identity
$$H_N(1-s) = H_N(1-\fr12[1+s])^2 =
H_N(\fr12[1-s])^2 \Longrightarrow H_N(s) = H_N(\fr12s)^2.$$
Iterating the identity gives $H_N(s) = H_N(s/2^k)^{2^k} =
(1-\lambda s/2^k + o(2^{-k}))^{2^k} \to e^{-\lambda s}$ with $\lambda =
-H_N'(0) = G_N'(1),$ so the only solution is the exponential curve
$H_N(s) = e^{-\lambda s}.$ Then $G_N(s) = e^{-\lambda(1-s)} =
e^{\lambda(s-1)},$ corresponding uniquely to $\sum_{k=0}^\infty{1\over
k!}\lambda^ke^{-\lambda}s^k.$ This is a Poisson distribution.

\q9
There is always a $p=1/3$ chance of finding a red token in each
collection. The probability that a string of $j$ collections has no
red token in the first $j-1$ and a red token in the $j$th is
$p(1-p)^{j-1},$ so the generating function is
$$\sum_{j=1}^\infty p(1-p)^{j-1}s^j = ps\sum_{j=1}^\infty (s(1-p))^{j-1} =
{ps\over1-s(1-p)}.$$

In the general case,
let $G_{Y_n}$ be the generating function of the time to acquire $n$
distinct coupons. $G_{Y_1} = s$ because it always takes exactly one
collection to acquire the first coupon. After $n$ coupons have been
collected, there is a $p = (m-n)/m$ chance of obtaining a new coupon in
each collection, so the probability that the ``time to the next
coupon'' is $k$ is $pq^{k-1}$ with $q = n/m,$ giving a generating
function ${ps\over1-qs}$ for each further coupon.
Convolving with $G_{Y_n}$ gives
$$G_{Y_{n+1}}(s) = G_{Y_n}(s){{m-n\over m}s\over1-{n\over m}s}
= G_{Y_n}(s){(m-n)s\over m-ns}
\Longrightarrow G_{Y_n}(s) = {(m-1)(m-2)\cdots(m-n+1)\,s^n \over
(m-s)(m-2s)\cdots(m-(n-1)s)}.$$
$$\Longrightarrow G_Y(s) = G_{Y_m}(s) =
{(m-1)!\,s^m\over(m-s)(m-2s)\cdots(m-(m-1)s)}.$$
$$\Longrightarrow \E(Y) = G_Y'(1) = m(m-1)!\cdot
 {{1\over 1}+{1\over2}+\cdots+{1\over m}\over(m-1)!}
= m\left(1+{1\over2}+\cdots+{1\over m}\right),$$
which is the given form.

\q10
The mean value of a discrete random variable $X$ is $\E(X) =
\sum_i i\Pr(X=i).$ For a nonnegative integer-valued random
variable, its generating function is $\phi(s) = \sum_{i=0}^\infty
\Pr(X=i)s^i.$

$\phi'(1) = \sum_{i=0}^\infty i\Pr(X=i) = \E(X),$ so $\phi(s) =
p(s)/q(s)$ has mean value
$$\phi'(1) = {p'(1)\over q(1)} -
{p(1)q'(1)\over q(1)^2} = {p'(1)\over q(1)} - \phi(1){q'(1)\over q(1)} =
{p'(1)-q'(1)\over q(1)},$$
since $\phi(1) = 1$ (it is a probability generating function).

Duelist A, shooting first, wins the duel on their $n$th shot with
probability $ar^{n-1},$ where $a$ is the chance of making that shot and
$r = (1-a)(1-b)$ is the chance that a full round of two shots both
miss.
$$\sum_{i=0}^\infty ar^i = {a\over 1-r} = {a\over a+b-ab}.$$
This is the probability that duelist A wins.

The number of shots fired has probability generating function
$$\phi(s) = {as\over 1-rs^2} + {b(1-a)s^2\over 1-rs^2} =
{as+b(1-a)s^2\over 1-rs^2};$$
the two terms represent, respectively, A winning on an odd-numbered
shot and B winning on an even-numbered shot. So
$$\E(\hbox{shots fired}) = \phi'(1) = {a + 2b(1-a) + 2r\over 1-r}$$
by the earlier established identity with $p=as+b(1-a)s^2$ and
$q=1-rs^2.$ This simplifies to ${2-a\over a+b-ab}.$

\q11
With $N=n,$
$$G_F(s) = \sum_{i=0}^n{n\choose i}p^i(1-p)^{n-i}s^i = (ps+1-p)^n.$$
Conditioning on $N$ gives
$$G_F(s) = \Pr(N=0) + \Pr(N=1)(ps+1-p) + \Pr(N=2)(ps+1-p)^2
+\cdots
= G_N(ps+1-p).$$

(b) is true by 3.6.14.
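The identity $G_F(s) = G_N(ps+1-p)$ from part (a) lends itself to a quick Monte Carlo check; a sketch below, where the choice $N\sim{\rm Binomial}(m,r)$ and all parameter values are arbitrary assumptions made only for the test:

```python
import random

random.seed(0)

# Monte Carlo check of G_F(s) = G_N(ps + 1 - p): thin a random sum and
# compare E[s^F] against G_N evaluated at ps + 1 - p.
# N ~ Binomial(m, r); m, r, p are arbitrary test values.
m, r, p = 8, 0.6, 0.3
G_N = lambda s: (1 - r + r * s) ** m

trials = 200_000

def thinned_sample():
    n = sum(random.random() < r for _ in range(m))      # draw N
    return sum(random.random() < p for _ in range(n))   # keep each w.p. p

samples = [thinned_sample() for _ in range(trials)]

for s in (0.2, 0.5, 0.9):
    empirical = sum(s**f for f in samples) / trials
    assert abs(empirical - G_N(p * s + 1 - p)) < 0.01
```
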
(c) is a case of 4.5.8: with $p=1/2$ (a fair coin), the two summands of
$N$ are independent, which forces $N$ to have a Poisson distribution.

\bye
diff --git a/houdre/hw5.tex b/houdre/hw5.tex
new file mode 100644
index 0000000..a952e76
--- /dev/null
+++ b/houdre/hw5.tex
@@ -0,0 +1,111 @@
\newfam\rsfs
\newfam\bbold
\def\scr#1{{\fam\rsfs #1}}
\def\bb#1{{\fam\bbold #1}}
\let\oldcal\cal
\def\cal#1{{\oldcal #1}}
\font\rsfsten=rsfs10
\font\rsfssev=rsfs7
\font\rsfsfiv=rsfs5
\textfont\rsfs=\rsfsten
\scriptfont\rsfs=\rsfssev
\scriptscriptfont\rsfs=\rsfsfiv
\font\bbten=msbm10
\font\bbsev=msbm7
\font\bbfiv=msbm5
\textfont\bbold=\bbten
\scriptfont\bbold=\bbsev
\scriptscriptfont\bbold=\bbfiv

\def\Pr{\bb P}
\def\E{\bb E}
\newcount\qnum
\def\q{\afterassignment\qq\qnum=}
\def\qq{\qqq{\number\qnum}}
\def\qqq#1{\bigskip\goodbreak\noindent{\bf#1)}\smallskip}
\def\fr#1#2{{#1\over #2}}
\def\var{\mathop{\rm var}\nolimits}
\def\infint{\int_{-\infty}^\infty}

\q1
The mean of this distribution is
$$\E(X) = \infint xf(x)\,dx.$$
This integral evaluates to $0$ by symmetry: $f(x) =
{1\over2}ce^{-c|x|}$ is an even function, so $xf(x)$ is odd.

The variance of this distribution is $\var(X) = \E(X^2) - \E(X)^2 =
\E(X^2).$ By theorem 5.58, with repeated integration by parts,
$$\E(X^2) = \infint x^2f(x)\,dx = \infint x^2{1\over2}ce^{-c|x|}\,dx
= \int_0^\infty x^2ce^{-cx}\,dx
= c\cdot{2\over c^3} = 2c^{-2}.$$

\q2
$$\Pr(X\geq w) = \sum_{k=w}^\infty {1\over k!}\lambda^ke^{-\lambda}.$$
$$\Pr(Y\leq \lambda) = \int_0^\lambda {1\over\Gamma(w)} x^{w-1}e^{-x}\,dx
= \int_0^\lambda {x^{w-1}e^{-x}\over (w-1)!}\,dx,$$
because $\Gamma(w) = (w-1)!$ for integer $w.$

We prove the equality of these two expressions by induction on $w$.
It is true for $w=1$: the Poisson side sums to
$1-e^{-\lambda}$ (because $\Pr(X\geq0) = 1$ and $\Pr(X<1) = \Pr(X=0) =
e^{-\lambda}\lambda^0/0! = e^{-\lambda}$), and the Gamma side is
${1\over\Gamma(1)}\int_0^\lambda x^0e^{-x}\,dx =
-e^{-x}\big|^\lambda_0 = 1 - e^{-\lambda}.$

Assuming the equality holds for $w,$ it will be shown to
hold for $w+1$:
$$\sum_{k=w}^\infty {1\over k!}\lambda^ke^{-\lambda} = \int_0^\lambda
{x^{w-1}e^{-x}\over (w-1)!}\,dx.$$
$$\sum_{k=w}^\infty {1\over k!}\lambda^ke^{-\lambda}
= {1\over w!}\lambda^we^{-\lambda} + \sum_{k=w+1}^\infty
{\lambda^ke^{-\lambda}\over k!}.$$
$${1\over (w-1)!}\int_0^\lambda x^{w-1}e^{-x}\,dx
= {1\over (w-1)!}\left({x^we^{-x}\over w}\Big|^\lambda_0 +
\int_0^\lambda {x^we^{-x}\over w}\,dx\right) =
{\lambda^we^{-\lambda}\over w!} + \int_0^\lambda {x^we^{-x}\over
\Gamma(w+1)}\,dx$$
by integration by parts. Subtracting the leading terms, which are
equal, gives
$$\sum_{k=w+1}^\infty {\lambda^ke^{-\lambda}\over k!} =
\int_0^\lambda {x^we^{-x}\over \Gamma(w+1)}\,dx,$$
which is the statement for $w+1.$

QED

\q3
The density function $f(x)$ is proportional to $g(x),$ so there is some
constant $c$ such that
$$1 = \infint f(x)\,dx = c\infint g(x)\,dx = 2c\int_1^\infty x^{-n}\,dx =
2c\left(-{x^{1-n}\over n-1}\right)\Big|^\infty_1 = {2c\over n-1}
\Longrightarrow c = {n-1\over2}.$$

The mean and variance of $X$ exist when the integrals defining $\E(X)$
and $\E(X^2)$ converge absolutely. The first requires $n>2,$ because
for $n=2$
$$\int_1^\infty x\,x^{-n}\,dx = \ln(x)\big|^\infty_1$$
diverges (and for $n<2$ the integrand is even larger). Similarly, the
second requires $n>3,$ because
$$\int_1^\infty x^2x^{-n}\,dx = \ln(x)\big|^\infty_1$$
diverges for $n=3$ (and diverges even faster for $n<3$).
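The normalizing constant $c=(n-1)/2$ can be verified numerically; a sketch with the arbitrary choice $n=4$, a truncated midpoint sum standing in for the integral (the tail beyond the cutoff is negligible for this $n$):

```python
# Check that c = (n-1)/2 normalizes f(x) = c|x|^(-n) on |x| >= 1.
# n = 4 is an arbitrary choice; the integral over [1, 1e3] is a
# midpoint Riemann sum, and the tail past 1e3 is O(1e-9).
n = 4
c = (n - 1) / 2

steps, upper = 1_000_000, 1e3
dx = (upper - 1) / steps
half_mass = sum(c * (1 + (i + 0.5) * dx) ** -n * dx for i in range(steps))

# Total mass is twice the half-line integral, by symmetry.
assert abs(2 * half_mass - 1) < 1e-3
```
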
\q4
The density function of $Y=|X|$ is
$$f_Y(x) = \cases{{2\over\sqrt{2\pi}}e^{-x^2/2}&if $x\geq0$,\cr
0&otherwise.\cr}$$

$$\E(Y) = \int_0^\infty {2x\over \sqrt{2\pi}}e^{-x^2/2}\,dx =
{1\over\sqrt{2\pi}}\int_0^\infty e^{-u/2}\,du$$
by the substitution $u=x^2,$ becoming
$$-{2\over\sqrt{2\pi}}e^{-u/2}\Big|_0^\infty =
{2\over\sqrt{2\pi}} = \sqrt{2\over\pi}.$$

Similarly, $\var(Y) = \E(Y^2) - \E(Y)^2.$
$$\E(Y^2) = \int_0^\infty {2x^2\over \sqrt{2\pi}}e^{-\fr12 x^2}\,dx
= -{2\over\sqrt{2\pi}}xe^{-\fr12 x^2}\Big|^\infty_0
+ {2\over\sqrt{2\pi}}\int_0^\infty e^{-\fr12 x^2}\,dx$$
by integration by parts. The first term vanishes at both endpoints, and
the second is the integral of the standard normal density over the
whole line (by symmetry), so $\E(Y^2) = 0 + 1 = 1.$ Therefore
$$\var(Y) = 1 - \left(\sqrt{2\over\pi}\right)^2 = 1 - {2\over\pi}.$$

\bye
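The half-normal moments derived in problem 4 can be confirmed by simulation; a minimal sketch:

```python
import random
from math import pi, sqrt

random.seed(0)

# Monte Carlo check of E(Y) = sqrt(2/pi) and var(Y) = 1 - 2/pi
# for Y = |X| with X standard normal.
trials = 1_000_000
ys = [abs(random.gauss(0, 1)) for _ in range(trials)]

mean = sum(ys) / trials
var = sum(y * y for y in ys) / trials - mean**2

assert abs(mean - sqrt(2 / pi)) < 0.005
assert abs(var - (1 - 2 / pi)) < 0.005
```
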